datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
rolofapp/beta1 | ---
license: other
---
|
rjds0207/BetoStone | ---
license: openrail
---
|
arbml/alpagasus_cleaned_ar_reviewed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_en
dtype: string
- name: index
dtype: string
- name: instruction_en
dtype: string
- name: output
dtype: string
- name: output_en
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 3037648
num_examples: 2959
download_size: 0
dataset_size: 3037648
---
# Dataset Card for "alpagasus_cleaned_ar_reviewed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rmndrnts/MenoSet | ---
license: apache-2.0
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
A dataset created for the Meno multimodal LLM. It contains questions spanning three modalities: audio, visual, and text.
## Dataset Details
[You can load images and audios from here](https://disk.yandex.ru/d/0hQ7Mbyj8GPBCQ)
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
A dataset of chained intellectual questions was collected from open
data from the game "What? Where? When?" together with questions on arbitrary topics.
Each question in the dataset has one or more answers
that are acceptable in a dialogue with the model. The dataset is intended for
language-model fine-tuning and for improving the quality of user dialogue.
The dataset consists of 51 dialogues that combine three different modalities
and require erudition to answer.
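Since the card does not yet document a schema, here is a minimal, hypothetical sketch of how one sample with several acceptable answers might be represented and checked. The field names (`question`, `modality`, `acceptable_answers`) are assumptions for illustration, not the dataset's actual columns:

```python
# Hypothetical sketch of a MenoSet-style sample; field names are assumptions,
# so check the actual dataset schema before relying on them.

def is_acceptable(sample: dict, model_answer: str) -> bool:
    """Return True if the model's answer matches any acceptable answer (case-insensitive)."""
    normalized = model_answer.strip().lower()
    return any(normalized == ans.strip().lower() for ans in sample["acceptable_answers"])

sample = {
    "question": "Which instrument do you hear in this recording?",
    "modality": "audio",  # one of: "audio", "visual", "text"
    "acceptable_answers": ["violin", "a violin"],
}

print(is_acceptable(sample, "Violin"))  # matches after normalization
print(is_acceptable(sample, "cello"))
```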
TODO:
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
|
CyberHarem/yuzuriha_jigokuraku | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yuzuriha_jigokuraku
This is the dataset of yuzuriha_jigokuraku, containing 97 images and their tags.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
andersonbcdefg/sharegpt_reward_modeling_pairwise_no_as_an_ai | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response_a
dtype: string
- name: response_b
dtype: string
- name: explanation
dtype: string
- name: preferred
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 21367130
num_examples: 11841
download_size: 11592587
dataset_size: 21367130
---
# Dataset Card for "sharegpt_reward_modeling_pairwise_no_as_an_ai"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
danielleward/288-demo | ---
license: pddl
---
|
meninohackerhomem/GERALDO | ---
license: openrail
---
|
Vinotha/uaspeechall | ---
license: mit
dataset_info:
features:
- name: audio
dtype: audio
- name: speaker_id
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 8926602991.68
num_examples: 66280
- name: test
num_bytes: 3367004882.0
num_examples: 25000
download_size: 8826509203
dataset_size: 12293607873.68
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
rusano/ELI5_custom_encoded | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: decoder_attention_mask
sequence: int64
splits:
- name: train
num_bytes: 1309686912
num_examples: 196296
- name: test
num_bytes: 10054704
num_examples: 1507
- name: val
num_bytes: 327421728
num_examples: 49074
download_size: 151484595
dataset_size: 1647163344
---
# Dataset Card for "ELI5_custom_encoded"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dialbird/mental_health_chatbot_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 189421
num_examples: 172
download_size: 102271
dataset_size: 189421
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mental_health_chatbot_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dora-rs/dora-robomaster | ---
configs:
- config_name: image
data_files:
- split: train
path: graphs/out/*/image.parquet
- config_name: mistral
data_files:
- split: train
path: graphs/out/*/mistral_output_file.parquet
- config_name: chatgpt
data_files:
- split: train
path: graphs/out/*/chatgpt_output_file.parquet
- config_name: raw_file
data_files:
- split: train
path: graphs/out/*/raw_file.parquet
- config_name: saved_file
data_files:
- split: train
path: graphs/out/*/saved_file.parquet
- config_name: audio
data_files:
- split: train
path: graphs/out/*/audio.parquet
- config_name: whisper_text
data_files:
- split: train
path: graphs/out/*/whisper_text.parquet
- config_name: control
data_files:
- split: train
path: graphs/out/*/control.parquet
- config_name: gimbal_control
data_files:
- split: train
path: graphs/out/*/gimbal_control.parquet
- config_name: logs
data_files:
- split: train
path: graphs/out/*.txt
license: apache-2.0
language:
- en
tags:
- dora
- robotic
---
# Dora-Robomaster
This project aims to use Dora to enhance the capabilities of a RoboMaster S1.
You can see a quick demo here:
[](http://www.youtube.com/watch?v=NvvTEP8Jak8)
### Getting Started
Run the following commands to start the demo:
```bash
alias dora='dora-cli'
dora up
dora start graphs/dataflow.yml --attach
```
To start the reaction lighting test:
`dora start graphs/reaction.yml --attach`
## Installation of the Robomaster S1 Hack
This guide is an updated version of the original [Robomaster S1 SDK Hack Guide](https://www.bug-br.org.br/s1_sdk_hack.zip) and is intended for use on a Windows 11 system.
### Prerequisites
Before you get started, you'll need the following:
- Robomaster S1 (do not update it to the latest version, as it may block the hack).
- [Robomaster App](https://www.dji.com/fr/robomaster-s1/downloads).
- [Android SDK Platform-Tools](https://developer.android.com/tools/releases/platform-tools). Simply unzip it and keep the path handy.
- A micro USB cable. If this guide doesn't work, there might be an issue with the cable, and you may need to replace it with one that supports data transfer.
### Instructions
1. Start the Robomaster App and connect the Robomaster S1 using one of the two options provided (via router or via Wi-Fi).
2. While connected, use a micro USB cable to connect the robot to the computer's USB port. You should hear a beep sound, similar to when you connect any device. (Please note that no other Android device should be connected via USB during this process).
3. In the Lab section of the app, create a new Python application and paste the following code:
```python
def root_me(module):
__import__ = rm_define.__dict__['__builtins__']['__import__']
return __import__(module, globals(), locals(), [], 0)
builtins = root_me('builtins')
subprocess = root_me('subprocess')
proc = subprocess.Popen('/system/bin/adb_en.sh', shell=True, executable='/system/bin/sh', stdout=subprocess.PIPE, stderr=subprocess.PIPE)
```
4. Run the code; there should be no errors, and the console should display **Execution Complete**.
5. Without closing the app, navigate to the folder containing the Android SDK Platform-Tools and open a terminal inside it.
6. Run the ADB command `.\adb.exe devices`. If everything is working correctly, you should see output similar to this: 
7. Execute the `upload.sh` script located in the `s1_SDK` folder.
8. Once everything has been executed, restart the S1 by turning it off and then back on. While it's booting up, you should hear two chimes instead of the usual single chime, indicating that the hack has been successful.
## HuggingFace Dataset
To set up this repo as a dataset repository:
```bash
git lfs install
git clone https://huggingface.co/datasets/haixuantao/dora-robomaster
# To clone without large files (just their pointers),
# prepend this env var to the clone command:
GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/datasets/haixuantao/dora-robomaster
```
To use the dataset:
```python
from datasets import load_dataset
dataset = load_dataset("haixuantao/dora-robomaster")
```
|
joey234/mmlu-professional_medicine | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 9913
num_examples: 5
- name: test
num_bytes: 1180543
num_examples: 272
download_size: 295748
dataset_size: 1190456
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-professional_medicine"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dipteshkanojia/t5-qe-2023-enhi-da-test | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: task
dtype: string
splits:
- name: train
num_bytes: 905309
num_examples: 1074
download_size: 303099
dataset_size: 905309
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
- hi
---
# Dataset Card for "t5-qe-2023-enhi-da-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aryaman/causalgym | ---
license: mit
language:
- en
tags:
- interpretability
- linguistics
pretty_name: CausalGym
size_categories:
- 10K<n<100K
---
**CausalGym** is a benchmark for comparing the performance of causal interpretability methods
on a variety of simple linguistic tasks taken from the SyntaxGym evaluation set
([Gauthier et al., 2020](https://aclanthology.org/2020.acl-demos.10/), [Hu et al., 2020](https://aclanthology.org/2020.acl-main.158/))
and converted into a format suitable for interventional interpretability.
The dataset includes train/dev/test splits (exactly as used in the experiments in the paper).
The `base`/`src` columns are the prompts on which intervention is done. Each of these is a list of strings,
with each string being a span in the template which is aligned by index and may have an unequal number
of tokens. The `base_label` and `src_label` columns are the ground truth next-token predictions that we
train/evaluate on, and the `base_type` and `src_type` columns indicate the class (always binary) of the prompts.
Finally, the `task` column indicates which task this row is from. You should train separately on each task since
each one studies a different linguistic feature.
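The per-task training advice above can be sketched as follows: group rows by the `task` column before fitting anything. In-memory dicts stand in for the actual dataset here, and the row values are illustrative, not real CausalGym entries:

```python
from collections import defaultdict

# Illustrative rows following the schema described above; not real CausalGym data.
rows = [
    {"task": "agr_sv_num_subj-relc", "base": ["The keys", " to the cabinet"], "base_label": " are"},
    {"task": "agr_sv_num_subj-relc", "base": ["The key", " to the cabinets"], "base_label": " is"},
    {"task": "npi_any_subj-relc", "base": ["No girl", " who liked it"], "base_label": " any"},
]

# Train separately on each task, since each studies a different linguistic feature.
by_task = defaultdict(list)
for row in rows:
    by_task[row["task"]].append(row)

for task, task_rows in by_task.items():
    print(task, len(task_rows))  # fit one interpretability method per task here
```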
## Citation
If using this dataset, please cite the CausalGym paper as well as the preceding SyntaxGym papers.
```bibtex
@article{arora-etal-2024-causalgym,
title = "{C}ausal{G}ym: Benchmarking causal interpretability methods on linguistic tasks",
author = "Arora, Aryaman and Jurafsky, Dan and Potts, Christopher",
journal = "arXiv:2402.12560",
year = "2024",
url = "https://arxiv.org/abs/2402.12560"
}
@inproceedings{gauthier-etal-2020-syntaxgym,
title = "{S}yntax{G}ym: An Online Platform for Targeted Evaluation of Language Models",
author = "Gauthier, Jon and Hu, Jennifer and Wilcox, Ethan and Qian, Peng and Levy, Roger",
editor = "Celikyilmaz, Asli and Wen, Tsung-Hsien",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.acl-demos.10",
doi = "10.18653/v1/2020.acl-demos.10",
pages = "70--76",
}
@inproceedings{hu-etal-2020-systematic,
title = "A Systematic Assessment of Syntactic Generalization in Neural Language Models",
author = "Hu, Jennifer and Gauthier, Jon and Qian, Peng and Wilcox, Ethan and Levy, Roger",
editor = "Jurafsky, Dan and Chai, Joyce and Schluter, Natalie and Tetreault, Joel",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.acl-main.158",
doi = "10.18653/v1/2020.acl-main.158",
pages = "1725--1744",
}
``` |
TheGreatRambler/mm2_world_levels | ---
language:
- multilingual
license:
- cc-by-nc-sa-4.0
multilinguality:
- multilingual
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- other
- object-detection
- text-retrieval
- token-classification
- text-generation
task_ids: []
pretty_name: Mario Maker 2 super world levels
tags:
- text-mining
---
# Mario Maker 2 super world levels
Part of the [Mario Maker 2 Dataset Collection](https://tgrcode.com/posts/mario_maker_2_datasets)
## Dataset Description
The Mario Maker 2 super world levels dataset consists of 3.3 million super world levels from Nintendo's online service and adds onto `TheGreatRambler/mm2_world`. The dataset was created using the self-hosted [Mario Maker 2 api](https://tgrcode.com/posts/mario_maker_2_api) over the course of 1 month in February 2022.
### How to use it
You can load and iterate through the dataset with the following code:
```python
from datasets import load_dataset
ds = load_dataset("TheGreatRambler/mm2_world_levels", split="train")
print(next(iter(ds)))
#OUTPUT:
{
'pid': '14510618610706594411',
'data_id': 19170881,
'ninjis': 23
}
```
Each row is a level, denoted by `data_id`, within a super world owned by player `pid`. Each level has some number of ninjis (`ninjis`), a rough metric for its popularity.
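As a sketch of working with these fields, the example below aggregates ninjis per player over in-memory rows shaped like the instance above (the row values are made up for illustration):

```python
from collections import Counter

# Rows shaped like the dataset instances above; values are made up for illustration.
rows = [
    {"pid": "14510618610706594411", "data_id": 19170881, "ninjis": 23},
    {"pid": "14510618610706594411", "data_id": 19170882, "ninjis": 7},
    {"pid": "99999999999999999999", "data_id": 19170883, "ninjis": 5},
]

# Total ninjis per super-world owner: a rough popularity ranking.
ninjis_per_player = Counter()
for row in rows:
    ninjis_per_player[row["pid"]] += row["ninjis"]

print(ninjis_per_player.most_common(1))
```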
## Data Structure
### Data Instances
```python
{
'pid': '14510618610706594411',
'data_id': 19170881,
'ninjis': 23
}
```
### Data Fields
|Field|Type|Description|
|---|---|---|
|pid|string|The player ID of the user who created the super world with this level|
|data_id|int|The data ID of the level|
|ninjis|int|Number of ninjis shown on this level|
### Data Splits
The dataset only contains a train split.
<!-- TODO create detailed statistics -->
## Dataset Creation
The dataset was created over a little more than a month in February 2022 using the self-hosted [Mario Maker 2 api](https://tgrcode.com/posts/mario_maker_2_api). Because requests to Nintendo's servers require authentication, the process had to be done with the utmost care, limiting download speed so as not to overload the API and risk a ban. There are no intentions to create an updated release of this dataset.
## Considerations for Using the Data
The dataset contains no harmful language or depictions.
|
mattyhatch/tomatoesSpoof3 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 37099461.0
num_examples: 557
download_size: 33524817
dataset_size: 37099461.0
---
# Dataset Card for "tomatoesSpoof3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/FGVC_Aircraft_test_facebook_opt_2.7b_Visclues_ns_3333_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 300686660.375
num_examples: 3333
- name: fewshot_3_bs_16
num_bytes: 302943871.375
num_examples: 3333
download_size: 595742511
dataset_size: 603630531.75
---
# Dataset Card for "FGVC_Aircraft_test_facebook_opt_2.7b_Visclues_ns_3333_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1712959085 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3758
num_examples: 8
download_size: 7521
dataset_size: 3758
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712959085"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Samvardhan777/opus-100-German-to-English | ---
dataset_info:
features:
- name: formatted_text
dtype: string
splits:
- name: train
num_bytes: 189245956
num_examples: 1000000
download_size: 113697955
dataset_size: 189245956
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
siacus/tweets | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3897153
num_examples: 2404
download_size: 640183
dataset_size: 3897153
---
# Dataset Card for "tweets"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DL3DV/DL3DV-ALL-960P | ---
tags:
- 3D Vision
- NeRF
- 3D Gaussian
- Dataset
- Novel View Synthesis
- Text to 3D
- Image to 3D
pretty_name: Dl3DV-Dataset
size_categories:
- n>1T
---
# DL3DV-Dataset
This repo has all the 960P frames with camera poses of the DL3DV-10K Dataset. We are working hard to review the entire dataset to remove sensitive information. Thank you for your patience.
# Download
If you have enough space, you can use git to download the dataset from Hugging Face. See this [link](https://huggingface.co/docs/hub/en/datasets-downloading). The [480P](https://huggingface.co/datasets/DL3DV/DL3DV-ALL-480P)/[960P](https://huggingface.co/datasets/DL3DV/DL3DV-ALL-960P) versions should satisfy most needs.
If you do not have enough space, we further provide a [download script](https://github.com/DL3DV-10K/Dataset/blob/main/scripts/download.py) here to download a subset. The usage:
```Bash
usage: download.py [-h] --odir ODIR --subset {1K,2K,3K,4K,5K,6K,7K,8K,9K,10K} --resolution {4K,2K,960P,480P} --file_type {images+poses,video,colmap_cache} [--hash HASH]
[--clean_cache]
optional arguments:
-h, --help show this help message and exit
--odir ODIR output directory
--subset {1K,2K,3K,4K,5K,6K,7K,8K,9K,10K}
The subset of the benchmark to download
--resolution {4K,2K,960P,480P}
                        The resolution to download
--file_type {images+poses,video,colmap_cache}
The file type to download
--hash HASH If set subset=hash, this is the hash code of the scene to download
--clean_cache If set, will clean the huggingface cache to save space
```
Here are some examples:
```Bash
# Make sure you have applied for the access.
# Use this to download the download.py script
wget https://raw.githubusercontent.com/DL3DV-10K/Dataset/main/scripts/download.py
# Download 960P resolution images and poses, 0~1K subset, output to DL3DV-10K directory
python download.py --odir DL3DV-10K --subset 1K --resolution 960P --file_type images+poses --clean_cache
# Download 960P resolution images and poses, 1K~2K subset, output to DL3DV-10K directory
python download.py --odir DL3DV-10K --subset 2K --resolution 960P --file_type images+poses --clean_cache
```
You can also download a specific scene with its hash. The scene-hash pair visualization can be found [here](https://htmlpreview.github.io/?https://github.com/DL3DV-10K/Dataset/blob/main/visualize/index.html).
```Bash
python download.py --odir DL3DV-10K --subset 2K --resolution 960P --file_type images+poses --hash e2cedefea8a0ed2d0ffbd5bdc08acbe7e1f85c96f72f7b790e9dfe1c98963047 --clean_cache
```
# News
- [x] DL3DV-1K, 2K, 3K, 4K
- [ ] DL3DV-5K ~ 10K
|
thobauma/harmless-poisoned-0.05-symbols-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
elisachen/example_dataset | ---
license: bsd
---
|
samuelsze/bev_da_d_pedx_walkway_carpark | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 544986388.919
num_examples: 40157
download_size: 307989767
dataset_size: 544986388.919
---
# Dataset Card for "bev_da_d_pedx_walkway_carpark"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fewefWEGwg/sentiment_analysis_dataset | ---
license: mit
---
|
Multimodal-Fatima/FGVC_Aircraft_test_facebook_opt_6.7b_Attributes_Caption_ns_3333_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 300148506.375
num_examples: 3333
- name: fewshot_3_bs_16
num_bytes: 301866097.375
num_examples: 3333
download_size: 590830197
dataset_size: 602014603.75
---
# Dataset Card for "FGVC_Aircraft_test_facebook_opt_6.7b_Attributes_Caption_ns_3333_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zbulrush/lineart | ---
license: openrail
---
|
BEE-spoke-data/sbert-paraphrase-data | ---
language:
- en
license: odc-by
size_categories:
- 100M<n<1B
task_categories:
- sentence-similarity
dataset_info:
- config_name: default
features:
- name: '0'
dtype: string
- name: '1'
dtype: string
splits:
- name: train
num_bytes: 23655222164
num_examples: 142947230
download_size: 15494823340
dataset_size: 23655222164
- config_name: msmarco-triplets-flat
features:
- name: text
dtype: string
- name: positive
dtype: string
- name: negative
dtype: string
splits:
- name: train
num_bytes: 358771844
num_examples: 485469
download_size: 233344152
dataset_size: 358771844
- config_name: pairs-100word
features:
- name: '0'
dtype: string
- name: '1'
dtype: string
splits:
- name: train
num_bytes: 2317278084
num_examples: 1611483
download_size: 1332475321
dataset_size: 2317278084
- config_name: triplets
features:
- name: text
dtype: string
- name: positive
dtype: string
- name: negative
dtype: string
splits:
- name: train
num_bytes: 222068225
num_examples: 1064993
download_size: 106956648
dataset_size: 222068225
- config_name: triplets-expanded
features:
- name: text
dtype: string
- name: positive
dtype: string
- name: negative
dtype: string
splits:
- name: train
num_bytes: 1028568107
num_examples: 1660962
download_size: 693685496
dataset_size: 1028568107
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: msmarco-triplets-flat
data_files:
- split: train
path: msmarco-triplets-flat/train-*
- config_name: pairs-100word
data_files:
- split: train
path: pairs-100word/train-*
- config_name: triplets
data_files:
- split: train
path: triplets/train-*
- config_name: triplets-expanded
data_files:
- split: train
path: triplets-expanded/train-*
---
# BEE-spoke-data/sbert-paraphrase-data
Paraphrase data from [sentence-transformers](https://www.sbert.net/examples/training/paraphrases/README.html#datasets)
## contents
### default
| No. | Filename |
|-----|--------------------------------------------------------------|
| 1 | yahoo_answers_title_question.jsonl |
| 2 | squad_pairs.jsonl |
| 3 | eli5_question_answer.jsonl |
| 4 | WikiAnswers_pairs.jsonl |
| 5 | stackexchange_duplicate_questions_title_title.jsonl |
| 6 | TriviaQA_pairs.jsonl |
| 7 | stackexchange_duplicate_questions.jsonl |
| 8 | sentence-compression.jsonl |
| 9 | AllNLI_2cols.jsonl |
| 10 | NQ-train_pairs.jsonl |
| 11 | searchQA_question_top5_snippets_merged.jsonl |
| 12 | stackexchange_duplicate_questions_title-body_title-body.jsonl|
| 13 | SimpleWiki.jsonl |
| 14 | yahoo_answers_question_answer.jsonl |
| 15 | gooaq_pairs.jsonl |
| 16 | quora_duplicates.jsonl |
| 17 | stackexchange_duplicate_questions_body_body.jsonl |
| 18 | yahoo_answers_title_answer.jsonl |
| 19 | S2ORC_citation_pairs.jsonl |
| 20 | stackexchange_title_body_small.jsonl |
| 21 | fever_train.jsonl |
| 22 | altlex.jsonl |
| 23 | amazon-qa-train-pairs.jsonl |
| 24 | codesearchnet.jsonl |
| 25 | searchQA_question_topSnippet.jsonl |
### triplets
| No. | Filename |
|-----|--------------------------------------|
| 1 | AllNLI.jsonl |
| 2 | specter_train_triples.jsonl |
| 3 | quora_duplicates_triplets.jsonl |
|
adamo1139/toxic-dpo-natural-v5 | ---
license: other
license_name: other
license_link: LICENSE
---
I mixed toxic-dpo-natural-v4 and rawrr v2-1 stage 2 with the chosen field from the original no_robots and produced toxic-dpo-natural-v5. The goal is to avoid overfitting via DPO to a specific instruction style, and instead use DPO to make the model more open to answering and to answer like a human being. We'll see whether this works. |
dspoka/sdg-single | ---
dataset_info:
features:
- name: iso3
dtype: string
- name: country
dtype: string
- name: goal
dtype: string
- name: target
dtype: string
- name: text
dtype: string
- name: status
dtype: string
- name: sector
dtype: string
- name: response
dtype: string
- name: infotype
dtype: string
- name: start
dtype: float64
- name: end
dtype: float64
- name: filename
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: full
num_bytes: 4297968
num_examples: 14219
download_size: 0
dataset_size: 4297968
---
# Dataset Card for "sdg-single"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zarakun/youtube_ua_subtitles_test | ---
task_categories:
- automatic-speech-recognition
pretty_name: MangoSpeech
configs:
- config_name: rozdympodcast
data_files: "data/rozdympodcast.parquet"
- config_name: opodcast
data_files: "data/opodcast.parquet"
- config_name: test
data_files: "data/test.parquet"
---
# The list of all subsets in the dataset
Each subset is generated by splitting videos from a particular Ukrainian YouTube channel.
All subsets are in the test split.
- "opodcast" subset is from channel "О! ПОДКАСТ"
- "rozdympodcast" subset is from channel "Роздум | Подкаст"
- "test" subset is just a small subset of samples
# Loading a particular subset
```python
>>> from datasets import load_dataset
>>> data_files = {"train": "data/<your_subset>.parquet"}
>>> data = load_dataset("Zarakun/youtube_ua_subtitles_test", data_files=data_files)
>>> data
DatasetDict({
train: Dataset({
features: ['audio', 'rate', 'duration', 'sentence'],
num_rows: <some_number>
})
})
``` |
DBQ/Net.a.Porter.Product.prices.Italy | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Italy - Net-a-Porter - Product-level price list
tags:
- webscraping
- ecommerce
- Net
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: Net-a-Porter
dtype: string
- name: '2023-11-08'
dtype: string
- name: ITA
dtype: string
- name: EUR
dtype: string
- name: SAINT LAURENT
dtype: string
- name: CLOTHING
dtype: string
- name: DRESSES
dtype: string
- name: MIDI DRESSES
dtype: string
- name: '1647597276844592'
dtype: int64
- name: Lace-trimmed silk-satin midi dress
dtype: string
- name: https://www.net-a-porter.com/it/en/shop/product/saint-laurent/clothing/midi-dresses/lace-trimmed-silk-satin-midi-dress/1647597276844592
dtype: string
- name: https://www.net-a-porter.com/variants/images/1647597276844592/ou/w1000.jpg
dtype: string
- name: '3490.00'
dtype: float64
- name: 3490.00.1
dtype: float64
- name: 3490.00.2
dtype: float64
- name: 3490.00.3
dtype: float64
- name: '0'
dtype: int64
splits:
- name: train
num_bytes: 17591923
num_examples: 43148
download_size: 5140785
dataset_size: 17591923
---
# Net-a-Porter web scraped data
## About the website
The **Ecommerce** industry in the EMEA region, particularly in **Italy**, has seen significant growth due to digital transformation and increased web shopping habits. Within this sector, the luxury fashion industry is a standout, where **Net-a-Porter** operates. This platform effectively fuses traditional haute couture and digital luxury shopping, offering an extensive range of high-end clothing and accessories. The observed dataset contains **Ecommerce product-list page (PLP)** data on Net-a-Porter in Italy, providing insight into user behavior, product attractiveness, and market trends. This type of data supports the customization and effectiveness of digital marketing strategies for the brand.
## Link to **dataset**
[Italy - Net-a-Porter - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Net-a-Porter%20Product-prices%20Italy/r/recAG83il5B3oEP18)
|
AllyArc/allyarc_oai_format | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
pretty_name: AllyArc OpenAI Dataset Format
size_categories:
- 1K<n<10K
---
# Dataset Card for AllyArc/allyarc_oai_format
This dataset card provides a structured overview of the AllyArc/allyarc_oai_format dataset, designed for training conversational AI models tailored for educational purposes, with a special focus on supporting students with diverse learning needs, including those in Special Educational Needs (SEN) education.
## Dataset Details
### Dataset Description
The AllyArc/allyarc_oai_format dataset is comprised of conversational exchanges formatted to support the training of AI models for educational dialogues. It includes interactions that cover a wide range of educational support tasks, such as providing detailed explanations (breakdowns), adapting to various learning styles, incorporating student interests into lessons, and managing classroom dynamics tailored to SEN education.
- **Curated by:** Zainab Fahim
- **Language(s) (NLP):** English
- **License:** MIT License
### Dataset Sources
- **Repository:** Hugging Face Datasets
## Uses
### Direct Use
The dataset is intended for direct use in training conversational AI models to:
- Understand and respond to educational queries.
- Personalize interactions based on student needs and learning styles.
- Provide breakdowns of complex educational content.
- Engage students with tailored educational strategies.
### Out-of-Scope Use
This dataset is not intended for uses beyond educational support. Specifically, it should not be used for:
- Commercial advertising.
- Non-educational chatbot training.
- Any form of decision-making that could negatively impact students' wellbeing.
## Dataset Structure
The dataset is structured into dialogues, each containing multiple turns with roles (`system`, `user`, `assistant`) indicating the speaker. It includes fields for dialogue ID, turns, education level, subject matter, and feedback mechanisms, facilitating comprehensive educational dialogues.
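A minimal sketch of validating one dialogue in the role-based format described above; the exact schema beyond `role`/`content` pairs is an assumption based on the card's description, not documented fields:

```python
VALID_ROLES = {"system", "user", "assistant"}

def validate_dialogue(turns: list) -> bool:
    """Check that every turn has a known role and non-empty string content."""
    return all(
        turn.get("role") in VALID_ROLES
        and isinstance(turn.get("content"), str)
        and turn["content"]
        for turn in turns
    )

# Illustrative dialogue in the role-based format; content is made up.
dialogue = [
    {"role": "system", "content": "You are a patient SEN tutor."},
    {"role": "user", "content": "Can you break down what a fraction is?"},
    {"role": "assistant", "content": "Sure! A fraction shows parts of a whole..."},
]

print(validate_dialogue(dialogue))
print(validate_dialogue([{"role": "narrator", "content": "hi"}]))
```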
## Dataset Creation
### Curation Rationale
The dataset was curated to address the nuanced needs of SEN education, focusing on creating a supportive, interactive, and adaptive learning environment through AI-driven dialogues.
### Source Data
#### Data Collection and Processing
Data collection involved simulating educational dialogues that reflect typical interactions between students and an educational AI. The process emphasized personalization, adaptability, and inclusivity, considering the diverse needs of SEN students.
#### Who are the source data producers?
The data was produced by educational specialists, SEN teachers, and AI developers, with input from SEN students to ensure authenticity and relevance.
### Annotations
#### Annotation process
The dialogues were annotated with educational intent, subject matter tags, and personalized learning strategies to facilitate model training on educational tasks.
#### Who are the annotators?
Educational specialists and SEN teachers annotated the dataset, ensuring that the dialogues accurately reflect educational best practices and SEN considerations.
## Bias, Risks, and Limitations
The dataset aims to minimize bias by including diverse educational needs and learning styles. However, users should be aware of the limitations in scope and ensure models trained on this dataset are used ethically and considerately in educational contexts.
## Citation
**APA:**
AllyArc Educational Team. (2023). AllyArc/allyarc_oai_format Dataset. Hugging Face. URL
**BibTeX:**
```bibtex
@misc{allyarc2023dataset,
title={AllyArc/allyarc_oai_format Dataset},
author={AllyArc Educational Team},
year={2023},
publisher={Hugging Face},
howpublished={\url{}},
}
```
## Dataset Card Authors
Zainab Fahim
## Dataset Card Contact
For inquiries related to the AllyArc/allyarc_oai_format dataset, please contact: [Zainab Fahim](mailto:shafna.zainab.fahim@gmail.com) |
cledoux42/autotrain-data-ethnicity-test_v003 | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: ethnicity-test_v003
## Dataset Description
This dataset has been automatically processed by AutoTrain for project ethnicity-test_v003.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<512x512 RGB PIL image>",
"target": 1
},
{
"image": "<512x512 RGB PIL image>",
"target": 3
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['african', 'asian', 'caucasian', 'hispanic', 'indian'], id=None)"
}
```
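The integer `target` values index into the `ClassLabel` name list above, so the sample instances can be decoded with plain list indexing (a sketch; the real `datasets.ClassLabel` object exposes the same mapping via its `int2str` method):

```python
# Class names in the order declared by the ClassLabel feature above.
class_names = ["african", "asian", "caucasian", "hispanic", "indian"]

def decode_target(target: int) -> str:
    """Translate an integer label into its human-readable class name."""
    return class_names[target]

# The two sample instances shown above carry targets 1 and 3.
print(decode_target(1))  # asian
print(decode_target(3))  # hispanic
```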
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 4531 |
| valid | 1135 |
|
HuggingSara/usmle_self_assessment | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: options
struct:
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: E
dtype: string
- name: F
dtype: string
- name: G
dtype: string
- name: H
dtype: string
- name: I
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
- name: answer_idx
dtype: string
splits:
- name: train
num_bytes: 372032
num_examples: 325
download_size: 213238
dataset_size: 372032
---
# Dataset Card for "usmle_self_assesment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3 | ---
pretty_name: Evaluation run of yeontaek/llama-2-70B-ensemble-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-70B-ensemble-v3](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-01T14:01:58.848407](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3/blob/main/results_2023-09-01T14%3A01%3A58.848407.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6813782482106774,\n\
\ \"acc_stderr\": 0.03171011741691581,\n \"acc_norm\": 0.6847848607826429,\n\
\ \"acc_norm_stderr\": 0.031684498624315015,\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.01743349010253877,\n \"mc2\": 0.6421820394674438,\n\
\ \"mc2_stderr\": 0.015085186356964665\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6621160409556314,\n \"acc_stderr\": 0.013822047922283504,\n\
\ \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.013572657703084948\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6936865166301533,\n\
\ \"acc_stderr\": 0.004600194559865542,\n \"acc_norm\": 0.8716391157140012,\n\
\ \"acc_norm_stderr\": 0.003338076015617253\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7622641509433963,\n \"acc_stderr\": 0.02619980880756192,\n\
\ \"acc_norm\": 0.7622641509433963,\n \"acc_norm_stderr\": 0.02619980880756192\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n\
\ \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n\
\ \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6340425531914894,\n\
\ \"acc_stderr\": 0.0314895582974553,\n \"acc_norm\": 0.6340425531914894,\n\
\ \"acc_norm_stderr\": 0.0314895582974553\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04644602091222318,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04644602091222318\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"\
acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48412698412698413,\n \"acc_stderr\": 0.025738330639412152,\n \"\
acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.025738330639412152\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8225806451612904,\n \"acc_stderr\": 0.021732540689329286,\n \"\
acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.021732540689329286\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603918,\n \"\
acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603918\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.02340092891831049,\n \
\ \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.02340092891831049\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652459,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652459\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.027205371538279476,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.027205371538279476\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8807339449541285,\n \"acc_stderr\": 0.013895729292588949,\n \"\
acc_norm\": 0.8807339449541285,\n \"acc_norm_stderr\": 0.013895729292588949\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969426998,\n \"\
acc_norm\": 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969426998\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.890295358649789,\n \"acc_stderr\": 0.02034340073486884,\n \
\ \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.02034340073486884\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383602,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383602\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n\
\ \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990915,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990915\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8607918263090677,\n\
\ \"acc_stderr\": 0.012378786101885145,\n \"acc_norm\": 0.8607918263090677,\n\
\ \"acc_norm_stderr\": 0.012378786101885145\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5787709497206703,\n\
\ \"acc_stderr\": 0.016513676031179595,\n \"acc_norm\": 0.5787709497206703,\n\
\ \"acc_norm_stderr\": 0.016513676031179595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.752411575562701,\n\
\ \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.752411575562701,\n\
\ \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7993827160493827,\n \"acc_stderr\": 0.02228231394977488,\n\
\ \"acc_norm\": 0.7993827160493827,\n \"acc_norm_stderr\": 0.02228231394977488\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5709219858156028,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.5709219858156028,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5645371577574967,\n\
\ \"acc_stderr\": 0.012663412101248349,\n \"acc_norm\": 0.5645371577574967,\n\
\ \"acc_norm_stderr\": 0.012663412101248349\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7336601307189542,\n \"acc_stderr\": 0.017883188134667206,\n \
\ \"acc_norm\": 0.7336601307189542,\n \"acc_norm_stderr\": 0.017883188134667206\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.024112678240900794,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.024112678240900794\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.01743349010253877,\n \"mc2\": 0.6421820394674438,\n\
\ \"mc2_stderr\": 0.015085186356964665\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-70B-ensemble-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|arc:challenge|25_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hellaswag|10_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T14:01:58.848407.parquet'
- config_name: results
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- results_2023-09-01T14:01:58.848407.parquet
- split: latest
path:
- results_2023-09-01T14:01:58.848407.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v3](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
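For reference, the timestamped split names in the configurations above appear to be derived from the run timestamp by replacing the characters that are not allowed in split names. A minimal sketch of that mapping (illustrative only, not part of any published API):

```python
# Illustrative only: the per-run split names in this card look like the run
# timestamp with '-' and ':' replaced by '_'.
run_timestamp = "2023-09-01T14:01:58.848407"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_09_01T14_01_58.848407
```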
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-01T14:01:58.848407](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3/blob/main/results_2023-09-01T14%3A01%3A58.848407.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6813782482106774,
"acc_stderr": 0.03171011741691581,
"acc_norm": 0.6847848607826429,
"acc_norm_stderr": 0.031684498624315015,
"mc1": 0.45532435740514077,
"mc1_stderr": 0.01743349010253877,
"mc2": 0.6421820394674438,
"mc2_stderr": 0.015085186356964665
},
"harness|arc:challenge|25": {
"acc": 0.6621160409556314,
"acc_stderr": 0.013822047922283504,
"acc_norm": 0.6851535836177475,
"acc_norm_stderr": 0.013572657703084948
},
"harness|hellaswag|10": {
"acc": 0.6936865166301533,
"acc_stderr": 0.004600194559865542,
"acc_norm": 0.8716391157140012,
"acc_norm_stderr": 0.003338076015617253
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7622641509433963,
"acc_stderr": 0.02619980880756192,
"acc_norm": 0.7622641509433963,
"acc_norm_stderr": 0.02619980880756192
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6340425531914894,
"acc_stderr": 0.0314895582974553,
"acc_norm": 0.6340425531914894,
"acc_norm_stderr": 0.0314895582974553
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04644602091222318,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04644602091222318
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.025738330639412152,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.025738330639412152
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329286,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781678,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603918,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603918
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360755,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360755
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.02340092891831049,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.02340092891831049
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652459,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652459
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.027205371538279476,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.027205371538279476
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8807339449541285,
"acc_stderr": 0.013895729292588949,
"acc_norm": 0.8807339449541285,
"acc_norm_stderr": 0.013895729292588949
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969426998,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969426998
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.02034340073486884,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.02034340073486884
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383602,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383602
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5625,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990915,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990915
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8607918263090677,
"acc_stderr": 0.012378786101885145,
"acc_norm": 0.8607918263090677,
"acc_norm_stderr": 0.012378786101885145
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5787709497206703,
"acc_stderr": 0.016513676031179595,
"acc_norm": 0.5787709497206703,
"acc_norm_stderr": 0.016513676031179595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.752411575562701,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.752411575562701,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7993827160493827,
"acc_stderr": 0.02228231394977488,
"acc_norm": 0.7993827160493827,
"acc_norm_stderr": 0.02228231394977488
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5709219858156028,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.5709219858156028,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5645371577574967,
"acc_stderr": 0.012663412101248349,
"acc_norm": 0.5645371577574967,
"acc_norm_stderr": 0.012663412101248349
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7336601307189542,
"acc_stderr": 0.017883188134667206,
"acc_norm": 0.7336601307189542,
"acc_norm_stderr": 0.017883188134667206
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900794,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900794
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45532435740514077,
"mc1_stderr": 0.01743349010253877,
"mc2": 0.6421820394674438,
"mc2_stderr": 0.015085186356964665
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dkennedy-USGS/satellite-pond-detection | ---
license: apache-2.0
---
|
Feanix/gtzan-10-sec | ---
pretty_name: GTZAN
task_categories:
- audio-classification
tags:
- music
size_categories:
- 1K<n<10K
---
# Dataset Card for GTZAN
## Table of Contents
- [Dataset Card for GTZAN](#dataset-card-for-gtzan)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [http://marsyas.info/downloads/datasets.html](http://marsyas.info/downloads/datasets.html)
- **Paper:** [http://ismir2001.ismir.net/pdf/tzanetakis.pdf](http://ismir2001.ismir.net/pdf/tzanetakis.pdf)
- **Point of Contact:**
### Dataset Summary
GTZAN is a dataset for musical genre classification of audio signals. The dataset consists of 1,000 audio tracks, each 30 seconds long. It contains 10 genres, each represented by 100 tracks. The tracks are all 22,050 Hz mono 16-bit audio files in WAV format. The genres are: blues, classical, country, disco, hiphop, jazz, metal, pop, reggae, and rock.
*** THIS VERSION OF THE DATASET CONTAINS THE ORIGINAL AUDIO TRACKS SEGMENTED INTO 10 SECOND LONG FILES ***
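As a quick sanity check on the size of this segmented version (a back-of-the-envelope sketch, not code from the dataset itself): 1,000 original tracks of 30 seconds each, cut into 10-second files, yields 3,000 clips, consistent with the `1K<n<10K` size category above.

```python
# Back-of-the-envelope count of clips in the 10-second segmented version.
n_tracks = 1000        # original GTZAN tracks
track_len_s = 30       # each original track is 30 seconds long
segment_len_s = 10     # this version segments tracks into 10-second files

clips_per_track = track_len_s // segment_len_s
total_clips = n_tracks * clips_per_track
print(total_clips)  # 3000
```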
### Languages
English
## Dataset Structure
GTZAN is distributed as a single dataset without a predefined training and test split. The information below refers to the single `train` split that is assigned by default.
### Data Instances
An example of GTZAN looks as follows:
```python
{
"file": "/path/to/cache/genres/blues/blues.00000.wav",
"audio": {
"path": "/path/to/cache/genres/blues/blues.00000.wav",
"array": array(
[
0.00732422,
0.01660156,
0.00762939,
...,
-0.05560303,
-0.06106567,
-0.06417847,
],
dtype=float32,
),
"sampling_rate": 22050,
},
"genre": 0,
}
```
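A minimal sketch of consuming one such instance, mirroring the field layout above (the waveform values here are placeholders, not real GTZAN data):

```python
# Illustrative instance with the same structure as the example above;
# the array is 10 seconds of silence standing in for a real decoded clip.
example = {
    "file": "/path/to/cache/genres/blues/blues.00000.wav",
    "audio": {
        "path": "/path/to/cache/genres/blues/blues.00000.wav",
        "array": [0.0] * (22050 * 10),
        "sampling_rate": 22050,
    },
    "genre": 0,
}

# Clip duration in seconds = number of samples / sampling rate.
duration = len(example["audio"]["array"]) / example["audio"]["sampling_rate"]
print(duration)  # 10.0
```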
### Data Fields
The types associated with each of the data fields are as follows:
* `file`: a `string` feature.
* `audio`: an `Audio` feature containing the `path` of the sound file, the decoded waveform in the `array` field, and the `sampling_rate`.
* `genre`: a `ClassLabel` feature.
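As a sketch of how the integer `genre` label maps to a genre name (assuming the alphabetical ordering given in the summary, which is consistent with the example instance where `blues.00000.wav` has `genre: 0`):

```python
# Genre names in the order listed in the dataset summary (alphabetical).
GENRES = ["blues", "classical", "country", "disco", "hiphop",
          "jazz", "metal", "pop", "reggae", "rock"]

def int2str(genre_id: int) -> str:
    """Map a `genre` ClassLabel integer to its genre name."""
    return GENRES[genre_id]

print(int2str(0))  # blues
```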
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@misc{tzanetakis_essl_cook_2001,
author = "Tzanetakis, George and Essl, Georg and Cook, Perry",
title = "Automatic Musical Genre Classification Of Audio Signals",
url = "http://ismir2001.ismir.net/pdf/tzanetakis.pdf",
publisher = "The International Society for Music Information Retrieval",
year = "2001"
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun) for adding this dataset. |
presencesw/dataset4_translated_not_cleaned | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: references
sequence: string
- name: question_vi
dtype: string
- name: answer_vi
dtype: string
- name: references_vi
sequence: string
splits:
- name: train
num_bytes: 3347088.67502309
num_examples: 556
- name: validation
num_bytes: 501960.511
num_examples: 83
- name: test
num_bytes: 231468.64
num_examples: 38
download_size: 2152891
dataset_size: 4080517.82602309
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
scutcyr/SoulChatCorpus | ---
license: apache-2.0
---
|
erfanzar/Zeus-v0.1-Llama | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 643963135
num_examples: 386175
download_size: 340667895
dataset_size: 643963135
---
# Dataset Card for "Zeus-v0.1-Llama"
Contains prompts in LLaMA format derived from `erfanzar/Zeus-v0.1` |
ucalyptus/TheRanveerShow | ---
license: mit
---
|
adhikasp/hackernews | ---
license: unknown
---
|
feliciamj/processed_demo | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: package_name
dtype: string
- name: review
dtype: string
- name: date
dtype: string
- name: star
dtype: int64
- name: version_id
dtype: int64
splits:
- name: train
num_bytes: 1508
num_examples: 5
- name: test
num_bytes: 956
num_examples: 5
download_size: 9453
dataset_size: 2464
---
# Dataset Card for "processed_demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ContextualAI__Contextual_KTO_Mistral_PairRM | ---
pretty_name: Evaluation run of ContextualAI/Contextual_KTO_Mistral_PairRM
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ContextualAI/Contextual_KTO_Mistral_PairRM](https://huggingface.co/ContextualAI/Contextual_KTO_Mistral_PairRM)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ContextualAI__Contextual_KTO_Mistral_PairRM\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T14:17:07.643549](https://huggingface.co/datasets/open-llm-leaderboard/details_ContextualAI__Contextual_KTO_Mistral_PairRM/blob/main/results_2024-03-07T14-17-07.643549.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6022732654514354,\n\
\ \"acc_stderr\": 0.03325322256191159,\n \"acc_norm\": 0.6078337877090195,\n\
\ \"acc_norm_stderr\": 0.03392992795919382,\n \"mc1\": 0.5495716034271726,\n\
\ \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.7167420880007231,\n\
\ \"mc2_stderr\": 0.014911134722290867\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6040955631399317,\n \"acc_stderr\": 0.014291228393536588,\n\
\ \"acc_norm\": 0.6476109215017065,\n \"acc_norm_stderr\": 0.013960142600598673\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.681736705835491,\n\
\ \"acc_stderr\": 0.004648503177353963,\n \"acc_norm\": 0.8552081258713403,\n\
\ \"acc_norm_stderr\": 0.0035117170854519846\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.046570472605949646,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.046570472605949646\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.024757473902752056,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.024757473902752056\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.02686020644472435,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.02686020644472435\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846482,\n\
\ \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846482\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176085,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176085\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.03086868260412163,\n \
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.03086868260412163\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630797,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630797\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.03058759135160425,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.03058759135160425\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n\
\ \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n\
\ \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688235,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688235\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\
\ \"acc_stderr\": 0.015366860386397112,\n \"acc_norm\": 0.3027932960893855,\n\
\ \"acc_norm_stderr\": 0.015366860386397112\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388852,\n\
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388852\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.025976566010862744,\n\
\ \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.025976566010862744\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n\
\ \"acc_stderr\": 0.012620785155885998,\n \"acc_norm\": 0.423728813559322,\n\
\ \"acc_norm_stderr\": 0.012620785155885998\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.029624663581159696,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.029624663581159696\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6176470588235294,\n \"acc_stderr\": 0.01965992249362335,\n \
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.01965992249362335\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712845,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712845\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357305,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357305\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5495716034271726,\n\
\ \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.7167420880007231,\n\
\ \"mc2_stderr\": 0.014911134722290867\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.33813495072024263,\n \
\ \"acc_stderr\": 0.013030829145172198\n }\n}\n```"
repo_url: https://huggingface.co/ContextualAI/Contextual_KTO_Mistral_PairRM
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|arc:challenge|25_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|gsm8k|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hellaswag|10_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-17-07.643549.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T14-17-07.643549.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- '**/details_harness|winogrande|5_2024-03-07T14-17-07.643549.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T14-17-07.643549.parquet'
- config_name: results
data_files:
- split: 2024_03_07T14_17_07.643549
path:
- results_2024-03-07T14-17-07.643549.parquet
- split: latest
path:
- results_2024-03-07T14-17-07.643549.parquet
---
# Dataset Card for Evaluation run of ContextualAI/Contextual_KTO_Mistral_PairRM
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ContextualAI/Contextual_KTO_Mistral_PairRM](https://huggingface.co/ContextualAI/Contextual_KTO_Mistral_PairRM) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. An additional split, "latest", always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ContextualAI__Contextual_KTO_Mistral_PairRM",
"harness_winogrande_5",
	split="latest")
```
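Each per-task config name is derived from the metric key used in the results file (e.g. `harness|hendrycksTest-anatomy|5` corresponds to the config `harness_hendrycksTest_anatomy_5`). A small helper for this mapping, sketched from the config names listed above rather than any official API:

```python
def metric_key_to_config_name(key: str) -> str:
    """Map a results-file metric key such as 'harness|hendrycksTest-anatomy|5'
    to the corresponding dataset config name 'harness_hendrycksTest_anatomy_5'.

    Inferred from the config list above: '|', '-' and ':' all become underscores.
    """
    for ch in "|-:":
        key = key.replace(ch, "_")
    return key


# For example, the TruthfulQA metric key maps to its config name:
print(metric_key_to_config_name("harness|truthfulqa:mc|0"))  # harness_truthfulqa_mc_0
```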
## Latest results
These are the [latest results from run 2024-03-07T14:17:07.643549](https://huggingface.co/datasets/open-llm-leaderboard/details_ContextualAI__Contextual_KTO_Mistral_PairRM/blob/main/results_2024-03-07T14-17-07.643549.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each eval's results can be found under the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.6022732654514354,
"acc_stderr": 0.03325322256191159,
"acc_norm": 0.6078337877090195,
"acc_norm_stderr": 0.03392992795919382,
"mc1": 0.5495716034271726,
"mc1_stderr": 0.01741726437196764,
"mc2": 0.7167420880007231,
"mc2_stderr": 0.014911134722290867
},
"harness|arc:challenge|25": {
"acc": 0.6040955631399317,
"acc_stderr": 0.014291228393536588,
"acc_norm": 0.6476109215017065,
"acc_norm_stderr": 0.013960142600598673
},
"harness|hellaswag|10": {
"acc": 0.681736705835491,
"acc_stderr": 0.004648503177353963,
"acc_norm": 0.8552081258713403,
"acc_norm_stderr": 0.0035117170854519846
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949646,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949646
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.024757473902752056,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.024757473902752056
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472435,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472435
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176085,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176085
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.03086868260412163,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.03086868260412163
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630797,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630797
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688235,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688235
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.015366860386397112,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.015366860386397112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388852,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388852
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.025976566010862744,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.025976566010862744
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236848,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.012620785155885998,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.012620785155885998
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.029624663581159696,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.029624663581159696
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.01965992249362335,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.01965992249362335
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712845,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712845
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357305,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357305
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5495716034271726,
"mc1_stderr": 0.01741726437196764,
"mc2": 0.7167420880007231,
"mc2_stderr": 0.014911134722290867
},
"harness|winogrande|5": {
"acc": 0.755327545382794,
"acc_stderr": 0.012082125654159738
},
"harness|gsm8k|5": {
"acc": 0.33813495072024263,
"acc_stderr": 0.013030829145172198
}
}
```
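The per-task entries above all share the same shape, so aggregates such as the MMLU macro-average can be recomputed directly from the JSON. A minimal sketch, using a truncated two-task sample of the results dict shown above (the `results` variable here is an illustrative stand-in, not part of the dataset API):

```python
# Recompute the macro-average MMLU accuracy from the per-task results.
# `results` mirrors the structure of the JSON above, truncated to two tasks.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.562962962962963},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6118421052631579},
}

mmlu_scores = [
    v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU (acc_norm, macro-average over {len(mmlu_scores)} tasks): {mmlu_avg:.4f}")
```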
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
rlacombe/ClimateNet | ---
license: mit
---
|
nath720/stableDiff | ---
license: openrail
---
|
HanxuHU/mmmu_vi | ---
dataset_info:
- config_name: Accounting
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1600513.0
num_examples: 30
download_size: 1536313
dataset_size: 1600513.0
- config_name: Agriculture
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 119218416.0
num_examples: 30
download_size: 119223478
dataset_size: 119218416.0
- config_name: Architecture_and_Engineering
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 723286.0
num_examples: 30
download_size: 728058
dataset_size: 723286.0
- config_name: Art
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 29935691.0
num_examples: 30
download_size: 29943292
dataset_size: 29935691.0
- config_name: Art_Theory
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 33482039.0
num_examples: 30
download_size: 29784175
dataset_size: 33482039.0
- config_name: Basic_Medical_Science
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 4126498.0
num_examples: 30
download_size: 4131705
dataset_size: 4126498.0
- config_name: Biology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 8493236.0
num_examples: 30
download_size: 8494507
dataset_size: 8493236.0
- config_name: Chemistry
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1519571.0
num_examples: 30
download_size: 1525498
dataset_size: 1519571.0
- config_name: Clinical_Medicine
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 10884074.0
num_examples: 30
download_size: 10887082
dataset_size: 10884074.0
- config_name: Computer_Science
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 2073612.0
num_examples: 30
download_size: 2078787
dataset_size: 2073612.0
- config_name: Design
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 17923831.0
num_examples: 30
download_size: 16227936
dataset_size: 17923831.0
- config_name: Diagnostics_and_Laboratory_Medicine
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 37106915.0
num_examples: 30
download_size: 37090147
dataset_size: 37106915.0
- config_name: Economics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1488672.0
num_examples: 30
download_size: 1425996
dataset_size: 1488672.0
- config_name: Electronics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 641906.0
num_examples: 30
download_size: 645500
dataset_size: 641906.0
- config_name: Energy_and_Power
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1644195.0
num_examples: 30
download_size: 1647882
dataset_size: 1644195.0
- config_name: Finance
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1073960.0
num_examples: 30
download_size: 1004423
dataset_size: 1073960.0
- config_name: Geography
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 6672127.0
num_examples: 30
download_size: 6676981
dataset_size: 6672127.0
- config_name: History
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 8821056.0
num_examples: 30
download_size: 8431046
dataset_size: 8821056.0
- config_name: Literature
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 14242455.0
num_examples: 30
download_size: 14246949
dataset_size: 14242455.0
- config_name: Manage
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 3282286.0
num_examples: 30
download_size: 3141826
dataset_size: 3282286.0
- config_name: Marketing
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1475087.0
num_examples: 30
download_size: 1362121
dataset_size: 1475087.0
- config_name: Materials
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 2306413.0
num_examples: 30
download_size: 2310610
dataset_size: 2306413.0
- config_name: Math
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1445538.0
num_examples: 30
download_size: 1449131
dataset_size: 1445538.0
- config_name: Mechanical_Engineering
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 876287.0
num_examples: 30
download_size: 877662
dataset_size: 876287.0
- config_name: Music
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 9359678.0
num_examples: 30
download_size: 9363856
dataset_size: 9359678.0
- config_name: Pharmacy
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1657519.0
num_examples: 30
download_size: 1551833
dataset_size: 1657519.0
- config_name: Physics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1115721.0
num_examples: 30
download_size: 1117816
dataset_size: 1115721.0
- config_name: Psychology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 4412067.0
num_examples: 30
download_size: 4315496
dataset_size: 4412067.0
- config_name: Public_Health
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1512003.0
num_examples: 30
download_size: 1511863
dataset_size: 1512003.0
- config_name: Sociology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 18456100.0
num_examples: 30
download_size: 18459968
dataset_size: 18456100.0
configs:
- config_name: Accounting
data_files:
- split: validation
path: Accounting/validation-*
- config_name: Agriculture
data_files:
- split: validation
path: Agriculture/validation-*
- config_name: Architecture_and_Engineering
data_files:
- split: validation
path: Architecture_and_Engineering/validation-*
- config_name: Art
data_files:
- split: validation
path: Art/validation-*
- config_name: Art_Theory
data_files:
- split: validation
path: Art_Theory/validation-*
- config_name: Basic_Medical_Science
data_files:
- split: validation
path: Basic_Medical_Science/validation-*
- config_name: Biology
data_files:
- split: validation
path: Biology/validation-*
- config_name: Chemistry
data_files:
- split: validation
path: Chemistry/validation-*
- config_name: Clinical_Medicine
data_files:
- split: validation
path: Clinical_Medicine/validation-*
- config_name: Computer_Science
data_files:
- split: validation
path: Computer_Science/validation-*
- config_name: Design
data_files:
- split: validation
path: Design/validation-*
- config_name: Diagnostics_and_Laboratory_Medicine
data_files:
- split: validation
path: Diagnostics_and_Laboratory_Medicine/validation-*
- config_name: Economics
data_files:
- split: validation
path: Economics/validation-*
- config_name: Electronics
data_files:
- split: validation
path: Electronics/validation-*
- config_name: Energy_and_Power
data_files:
- split: validation
path: Energy_and_Power/validation-*
- config_name: Finance
data_files:
- split: validation
path: Finance/validation-*
- config_name: Geography
data_files:
- split: validation
path: Geography/validation-*
- config_name: History
data_files:
- split: validation
path: History/validation-*
- config_name: Literature
data_files:
- split: validation
path: Literature/validation-*
- config_name: Manage
data_files:
- split: validation
path: Manage/validation-*
- config_name: Marketing
data_files:
- split: validation
path: Marketing/validation-*
- config_name: Materials
data_files:
- split: validation
path: Materials/validation-*
- config_name: Math
data_files:
- split: validation
path: Math/validation-*
- config_name: Mechanical_Engineering
data_files:
- split: validation
path: Mechanical_Engineering/validation-*
- config_name: Music
data_files:
- split: validation
path: Music/validation-*
- config_name: Pharmacy
data_files:
- split: validation
path: Pharmacy/validation-*
- config_name: Physics
data_files:
- split: validation
path: Physics/validation-*
- config_name: Psychology
data_files:
- split: validation
path: Psychology/validation-*
- config_name: Public_Health
data_files:
- split: validation
path: Public_Health/validation-*
- config_name: Sociology
data_files:
- split: validation
path: Sociology/validation-*
---
|
hugosousa/professor_heideltime_en | ---
annotations_creators:
- machine-generated
language:
- en
- fr
- pt
- de
- it
- es
language_creators:
- found
license:
- mit
multilinguality:
- multilingual
pretty_name: Professor HeidelTime
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- Timex
- Timexs
- Temporal Expression
- Temporal Expressions
- Temporal Information
- Timex Identification
- Timex Classification
- Timex Extraction
task_categories:
- token-classification
task_ids:
- parsing
- part-of-speech
- named-entity-recognition
configs:
- config_name: portuguese
data_files: "portuguese.json"
- config_name: english
data_files: "english.json"
- config_name: french
data_files: "french.json"
- config_name: italian
data_files: "italian.json"
- config_name: spanish
data_files: "spanish.json"
- config_name: german
data_files: "german.json"
---
# Professor HeidelTime
[](https://dl.acm.org/doi/10.1145/3583780.3615130)
[](https://github.com/hmosousa/professor_heideltime)
Professor HeidelTime is a project to create a multilingual corpus weakly labeled with [HeidelTime](https://github.com/HeidelTime/heideltime), a temporal tagger.
## Corpus Details
The weak labeling was performed in six languages. Here are the specifics of the corpus for each language:
| Dataset | Language | Documents | From | To | Tokens | Timexs |
| ----------------------- | -------- | --------- | ---------- | ---------- | ---------- | -------- |
| All the News 2.0 | EN | 24,642 | 2016-01-01 | 2020-04-02 | 18,755,616 | 254,803 |
| Italian Crime News | IT | 9,619 | 2011-01-01 | 2021-12-31 | 3,296,898 | 58,823 |
| German News Dataset | DE | 33,266 | 2003-01-01 | 2022-12-31 | 21,617,888 | 348,011 |
| ElMundo News | ES | 19,095 | 2005-12-02 | 2021-10-18 | 12,515,410 | 194,043 |
| French Financial News | FR | 24,293 | 2017-10-19 | 2021-03-19 | 1,673,053 | 83,431 |
| Público News | PT | 27,154 | 2000-11-14 | 2002-03-20 | 5,929,377 | 111,810 |
## Contact
For more information, reach out to [Hugo Sousa](https://hugosousa.net) at <hugo.o.sousa@inesctec.pt>.
This dataset is part of the [Text2Story](https://text2story.inesctec.pt) project. This project is financed by the ERDF – European Regional Development Fund through the North Portugal Regional Operational Programme (NORTE 2020), under the PORTUGAL 2020 and by National Funds through the Portuguese funding agency, FCT - Fundação para a Ciência e a Tecnologia within project PTDC/CCI-COM/31857/2017 (NORTE-01-0145-FEDER-03185).
## Cite
If you use this work, please cite the following [paper](https://dl.acm.org/doi/10.1145/3583780.3615130):
```bibtex
@inproceedings{10.1145/3583780.3615130,
author = {Sousa, Hugo and Campos, Ricardo and Jorge, Al\'{\i}pio},
title = {TEI2GO: A Multilingual Approach for Fast Temporal Expression Identification},
year = {2023},
isbn = {9798400701245},
publisher = {Association for Computing Machinery},
url = {https://doi.org/10.1145/3583780.3615130},
doi = {10.1145/3583780.3615130},
booktitle = {Proceedings of the 32nd ACM International Conference on Information and Knowledge Management},
pages = {5401–5406},
numpages = {6},
keywords = {temporal expression identification, multilingual corpus, weak label},
location = {Birmingham, United Kingdom},
series = {CIKM '23}
}
```
|
DeepFoldProtein/foldseek_not_in_afdb | ---
dataset_info:
features:
- name: id
dtype: string
- name: seq
dtype: string
splits:
- name: train
num_bytes: 4869
num_examples: 9
download_size: 9213
dataset_size: 4869
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vikp/pypi_clean | ---
dataset_info:
features:
- name: code
dtype: string
- name: package
dtype: string
- name: path
dtype: string
- name: filename
dtype: string
splits:
- name: train
num_bytes: 31543801750
num_examples: 2438172
download_size: 9201420527
dataset_size: 31543801750
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "pypi_clean"
All of the latest package versions from pypi. The original data came from [here](https://py-code.org/datasets). I pulled the latest versions of each package, then extracted only `md`, `rst`, `ipynb`, and `py` files.
I then applied some cleaning:
- rendering notebooks
- removing leading comments/licenses |
Icchan/IhKamuKepoDeh | ---
license: mit
---
|
liuyanchen1015/MULTI_VALUE_rte_no_preverbal_negator | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 47861
num_examples: 107
- name: train
num_bytes: 41336
num_examples: 83
download_size: 69033
dataset_size: 89197
---
# Dataset Card for "MULTI_VALUE_rte_no_preverbal_negator"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dampish/Orion-Eval | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 3769602
num_examples: 350
download_size: 757991
dataset_size: 3769602
---
# Dataset Card for "Orion-Eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anz2/NASA_OSDR | ---
license: apache-2.0
---
|
furkanakkurt1618/pos_dataset-UD_Turkish-BOUN-v2.13 | ---
license: cc-by-sa-4.0
task_categories:
- token-classification
language:
- tr
pretty_name: UD Turkish BOUN Treebank POS Tagging
size_categories:
- 1K<n<10K
--- |
SatyamSSJ10/YorForger | ---
license: openrail
task_categories:
- image-to-text
pretty_name: YorForger
size_categories:
- n<1K
---
Trained on 29 N/SFW Yor Forger images, but don't worry! The SFW outputs work surprisingly well! |
kiitunp/MarieFrance | ---
task_categories:
- image-classification
language:
- fr
tags:
- woman
- natural
- body
- face
- Marie-France
size_categories:
- n<1K
--- |
ibivibiv/alpaca_tasksource9 | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 135882022
num_examples: 253970
download_size: 77275484
dataset_size: 135882022
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nekohacker591/test21 | ---
license: other
license_name: idkc
license_link: LICENSE
---
|
open-llm-leaderboard/details_aihub-app__ZySec-7B-v1 | ---
pretty_name: Evaluation run of aihub-app/ZySec-7B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aihub-app/ZySec-7B-v1](https://huggingface.co/aihub-app/ZySec-7B-v1) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aihub-app__ZySec-7B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T13:29:55.767663](https://huggingface.co/datasets/open-llm-leaderboard/details_aihub-app__ZySec-7B-v1/blob/main/results_2024-01-28T13-29-55.767663.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5992943703774103,\n\
\ \"acc_stderr\": 0.03331859785777884,\n \"acc_norm\": 0.6062149526919669,\n\
\ \"acc_norm_stderr\": 0.03403027227512744,\n \"mc1\": 0.4149326805385557,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5649322732323967,\n\
\ \"mc2_stderr\": 0.016365165663274596\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5998293515358362,\n \"acc_stderr\": 0.014317197787809181,\n\
\ \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.014070265519268802\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6620195180242979,\n\
\ \"acc_stderr\": 0.004720551323547126,\n \"acc_norm\": 0.8501294562836088,\n\
\ \"acc_norm_stderr\": 0.003562149890962717\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.0295822451283843,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.0295822451283843\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667493,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667493\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601684,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601684\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316453,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316453\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.0249393139069408,\n \
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.0249393139069408\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630797,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630797\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.033769221512523345,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.033769221512523345\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.014805384478371163,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.014805384478371163\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165555,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2994413407821229,\n\
\ \"acc_stderr\": 0.015318257745976708,\n \"acc_norm\": 0.2994413407821229,\n\
\ \"acc_norm_stderr\": 0.015318257745976708\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602656,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602656\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42242503259452413,\n\
\ \"acc_stderr\": 0.012615600475734921,\n \"acc_norm\": 0.42242503259452413,\n\
\ \"acc_norm_stderr\": 0.012615600475734921\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.630718954248366,\n \"acc_stderr\": 0.01952431674486635,\n \
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.01952431674486635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n\
\ \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n\
\ \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4149326805385557,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5649322732323967,\n\
\ \"mc2_stderr\": 0.016365165663274596\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773229\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.23199393479909022,\n \
\ \"acc_stderr\": 0.01162687317509241\n }\n}\n```"
repo_url: https://huggingface.co/aihub-app/ZySec-7B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|arc:challenge|25_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|gsm8k|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hellaswag|10_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T13-29-55.767663.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T13-29-55.767663.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- '**/details_harness|winogrande|5_2024-01-28T13-29-55.767663.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T13-29-55.767663.parquet'
- config_name: results
data_files:
- split: 2024_01_28T13_29_55.767663
path:
- results_2024-01-28T13-29-55.767663.parquet
- split: latest
path:
- results_2024-01-28T13-29-55.767663.parquet
---
# Dataset Card for Evaluation run of aihub-app/ZySec-7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [aihub-app/ZySec-7B-v1](https://huggingface.co/aihub-app/ZySec-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aihub-app__ZySec-7B-v1",
"harness_winogrande_5",
split="latest")
```
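Since each run is stored as a timestamped split alongside a `latest` alias, the newest run can also be resolved programmatically. A minimal sketch of such a helper (the split names below are illustrative examples taken from this card, not fetched from the Hub):

```python
from datetime import datetime

def newest_split(split_names):
    """Return the most recent timestamped split name.

    Split names follow the pattern YYYY_MM_DDTHH_MM_SS.ffffff,
    as used by the leaderboard details repositories.
    """
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

splits = ["2024_01_28T13_29_55.767663", "latest"]
print(newest_split(splits))  # 2024_01_28T13_29_55.767663
```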
## Latest results
These are the [latest results from run 2024-01-28T13:29:55.767663](https://huggingface.co/datasets/open-llm-leaderboard/details_aihub-app__ZySec-7B-v1/blob/main/results_2024-01-28T13-29-55.767663.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5992943703774103,
"acc_stderr": 0.03331859785777884,
"acc_norm": 0.6062149526919669,
"acc_norm_stderr": 0.03403027227512744,
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5649322732323967,
"mc2_stderr": 0.016365165663274596
},
"harness|arc:challenge|25": {
"acc": 0.5998293515358362,
"acc_stderr": 0.014317197787809181,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.014070265519268802
},
"harness|hellaswag|10": {
"acc": 0.6620195180242979,
"acc_stderr": 0.004720551323547126,
"acc_norm": 0.8501294562836088,
"acc_norm_stderr": 0.003562149890962717
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534015,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534015
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.0295822451283843,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.0295822451283843
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.03268572658667493,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.03268572658667493
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601684,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601684
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316453,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.0249393139069408,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.0249393139069408
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524572,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630797,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630797
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.033769221512523345,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.033769221512523345
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929188,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929188
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371163,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371163
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165555,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2994413407821229,
"acc_stderr": 0.015318257745976708,
"acc_norm": 0.2994413407821229,
"acc_norm_stderr": 0.015318257745976708
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.027634176689602656,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.027634176689602656
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42242503259452413,
"acc_stderr": 0.012615600475734921,
"acc_norm": 0.42242503259452413,
"acc_norm_stderr": 0.012615600475734921
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.01952431674486635,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.01952431674486635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5649322732323967,
"mc2_stderr": 0.016365165663274596
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.011616198215773229
},
"harness|gsm8k|5": {
"acc": 0.23199393479909022,
"acc_stderr": 0.01162687317509241
}
}
```
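The aggregated numbers under `"all"` are simple macro-averages over the per-task entries. As a sketch of how such a summary could be recomputed from the JSON above (the two-task dict here is a truncated illustration, not the full results file):

```python
def macro_average(results, metric="acc"):
    """Average a metric over all per-task entries that report it."""
    values = [scores[metric] for task, scores in results.items()
              if task != "all" and metric in scores]
    return sum(values) / len(values)

results = {
    "all": {"acc": 0.55},  # placeholder; recomputed below
    "harness|arc:challenge|25": {"acc": 0.5998293515358362},
    "harness|hellaswag|10": {"acc": 0.6620195180242979},
}
print(round(macro_average(results), 4))  # 0.6309
```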
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CVasNLPExperiments/DTD_parition1_test_google_flan_t5_xxl_mode_C_A_ns_1880 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 823882
num_examples: 1880
download_size: 221717
dataset_size: 823882
---
# Dataset Card for "DTD_parition1_test_google_flan_t5_xxl_mode_C_A_ns_1880"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
David-Xu/astronomy-stack-dpo-text-20-percent | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 9764728
num_examples: 3588
- name: test
num_bytes: 1187244
num_examples: 398
download_size: 3288117
dataset_size: 10951972
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ricardosantoss/top_12_portuguese | ---
license: unknown
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: Nota_Clinica
dtype: string
- name: Sequencia_CID10_Lista
sequence: string
splits:
- name: train
num_bytes: 809003
num_examples: 799
- name: test
num_bytes: 211988
num_examples: 200
download_size: 321221
dataset_size: 1020991
---
|
speechcolab/gigaspeech | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: Gigaspeech
source_datasets: []
task_categories:
- automatic-speech-recognition
- text-to-speech
- text-to-audio
extra_gated_prompt: >-
SpeechColab does not own the copyright of the audio files. For researchers and
educators who wish to use the audio files for non-commercial research and/or
educational purposes, we can provide access through the Hub under certain
conditions and terms.
Terms of Access:
The "Researcher" has requested permission to use the GigaSpeech database (the
"Database") at Tsinghua University. In exchange for such permission,
Researcher hereby agrees to the following terms and conditions:
1. Researcher shall use the Database only for non-commercial research and
educational purposes.
2. The SpeechColab team and Tsinghua University make no representations or
warranties regarding the Database, including but not limited to warranties of
non-infringement or fitness for a particular purpose.
3. Researcher accepts full responsibility for his or her use of the Database
and shall defend and indemnify the SpeechColab team and Tsinghua University,
including their employees, Trustees, officers and agents, against any and all
claims arising from Researcher's use of the Database, including but not
limited to Researcher's use of any copies of copyrighted audio files that he
or she may create from the Database.
4. Researcher may provide research associates and colleagues with access to
the Database provided that they first agree to be bound by these terms and
conditions.
5. The SpeechColab team and Tsinghua University reserve the right to terminate
Researcher's access to the Database at any time.
6. If Researcher is employed by a for-profit, commercial entity, Researcher's
employer shall also be bound by these terms and conditions, and Researcher
hereby represents that he or she is fully authorized to enter into this
agreement on behalf of such employer.
!!! Please also fill out the Google Form https://forms.gle/UuGQAPyscGRrUMLq6
to request access to the Gigaspeech dataset.
extra_gated_fields:
Name: text
Email: text
Organization: text
Address: text
I hereby confirm that I have requested access via the Google Form provided above: checkbox
I accept the terms of access: checkbox
---
# Dataset Card for Gigaspeech
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
- [Terms of Access](#terms-of-access)
## Dataset Description
- **Homepage:** https://github.com/SpeechColab/GigaSpeech
- **Repository:** https://github.com/SpeechColab/GigaSpeech
- **Paper:** https://arxiv.org/abs/2106.06909
- **Leaderboard:** https://github.com/SpeechColab/GigaSpeech#leaderboard
- **Point of Contact:** [gigaspeech@speechcolab.org](mailto:gigaspeech@speechcolab.org)
### Dataset Summary
GigaSpeech is an evolving, multi-domain English speech recognition corpus with 10,000 hours of high quality labeled audio suitable for supervised training. The transcribed audio data is collected from audiobooks, podcasts and YouTube, covering both read and spontaneous speaking styles, and a variety of topics, such as arts, science, sports, etc.
### Example Usage
The training split has several configurations of various sizes:
XS, S, M, L, XL. See the section on "Data Splits" for more information. To download the XS configuration:
```python
from datasets import load_dataset
gs = load_dataset("speechcolab/gigaspeech", "xs", use_auth_token=True)
# see structure
print(gs)
# load audio sample on the fly
audio_input = gs["train"][0]["audio"] # first decoded audio sample
transcription = gs["train"][0]["text"] # first transcription
```
It is possible to download only the development or test data:
```python
gs_dev = load_dataset("speechcolab/gigaspeech", "dev", use_auth_token=True)
gs_test = load_dataset("speechcolab/gigaspeech", "test", use_auth_token=True)
```
### Supported Tasks and Leaderboards
- `automatic-speech-recognition`: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER). The task has an active leaderboard which can be found at https://github.com/SpeechColab/GigaSpeech#leaderboard and ranks models based on their WER.
- `text-to-speech`, `text-to-audio`: The dataset can also be used to train a model for Text-To-Speech (TTS).
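As a rough illustration of the WER metric, word error rate can be computed as the word-level Levenshtein distance between reference and hypothesis, divided by the reference length. The leaderboard uses its own scoring pipeline (and libraries such as `jiwer` or `evaluate` are the usual choice in practice), so the following is only a minimal self-contained sketch:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        dp[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(
                dp[i - 1][j] + 1,         # deletion
                dp[i][j - 1] + 1,         # insertion
                dp[i - 1][j - 1] + cost,  # substitution (or match)
            )
    return dp[len(ref)][len(hyp)] / len(ref)
```

Note that GigaSpeech's official scoring also applies text normalization (e.g. handling of the punctuation tags shown below) before computing WER.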
### Languages
GigaSpeech contains audio and transcription data in English.
## Dataset Structure
### Data Instances
```python
{
'segment_id': 'YOU0000000315_S0000660',
'speaker': 'N/A',
'text': "AS THEY'RE LEAVING <COMMA> CAN KASH PULL ZAHRA ASIDE REALLY QUICKLY <QUESTIONMARK>",
'audio':
{
# in streaming mode 'path' will be 'xs_chunks_0000/YOU0000000315_S0000660.wav'
'path': '/home/user/.cache/huggingface/datasets/downloads/extracted/9d48cf31/xs_chunks_0000/YOU0000000315_S0000660.wav',
'array': array([0.0005188 , 0.00085449, 0.00012207, ..., 0.00125122, 0.00076294, 0.00036621], dtype=float32),
'sampling_rate': 16000
},
'begin_time': 2941.889892578125,
'end_time': 2945.070068359375,
'audio_id': 'YOU0000000315',
'title': 'Return to Vasselheim | Critical Role: VOX MACHINA | Episode 43',
'url': 'https://www.youtube.com/watch?v=zr2n1fLVasU',
'source': 2,
'category': 24,
'original_full_path': 'audio/youtube/P0004/YOU0000000315.opus'
}
```
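Transcripts encode punctuation as uppercase tags, as in the `text` field above. A minimal sketch for mapping the four punctuation tags back to symbols for display (the `restore_punctuation` helper and the exact tag set are taken from the transcript convention illustrated above; consult the GigaSpeech repository for the authoritative normalization rules):

```python
# Punctuation tags as they appear in GigaSpeech transcripts,
# mapped to their display symbols.
PUNCT_TAGS = {
    "<COMMA>": ",",
    "<PERIOD>": ".",
    "<QUESTIONMARK>": "?",
    "<EXCLAMATIONPOINT>": "!",
}

def restore_punctuation(text: str) -> str:
    """Replace space-separated punctuation tags with the punctuation itself."""
    for tag, punct in PUNCT_TAGS.items():
        # Tags are preceded by a space in the transcript, so consume it too.
        text = text.replace(" " + tag, punct)
    return text
```

Applied to the example instance above, this yields `"AS THEY'RE LEAVING, CAN KASH PULL ZAHRA ASIDE REALLY QUICKLY?"`.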
### Data Fields
* segment_id (string) - string id of the segment.
* speaker (string) - string id of the speaker (can be "N/A").
* text (string) - transcription of the segment.
* begin_time (float) - start time of the segment in an original full audio.
* end_time (float32) - end time of the segment in an original full audio.
* audio (Audio feature) - a dictionary containing the path to the audio, the decoded audio array, and the sampling rate.
In non-streaming mode (default), the path points to the locally extracted audio. In streaming mode, the path is the relative path of an audio
segment inside its archive (as files are not downloaded and extracted locally).
* audio_id (string) - string id of the original full audio.
* title (string) - title of the original full audio.
* url (string) - url of the original full audio.
* source (ClassLabel) - id of the audio source. Sources are audiobook (0), podcast (1), and YouTube (2).
* category (ClassLabel) - id of the audio category, categories are listed below.
* original_full_path (string) - the relative path to the original full audio sample in the original data directory.
Categories are assigned from the following labels:
"People and Blogs", "Business", "Nonprofits and Activism", "Crime", "History", "Pets and Animals",
"News and Politics", "Travel and Events", "Kids and Family", "Leisure", "N/A", "Comedy", "News and Politics",
"Sports", "Arts", "Science and Technology", "Autos and Vehicles", "Science and Technology", "People and Blogs",
"Music", "Society and Culture", "Education", "Howto and Style", "Film and Animation", "Gaming", "Entertainment",
"Travel and Events", "Health and Fitness", "audiobook".
### Data Splits
The dataset has three splits: train, evaluation (dev) and test. The train split has five configurations of various sizes:
XS, S, M, L, XL. Larger subsets are supersets of smaller subsets, e.g., the L subset contains all the data from the M subset.
#### Transcribed Training Subsets Size
| Subset | Hours | Remarks |
|:---------------:|:-------------:|:-------------|
| XS | 10 | System building and debugging |
| S | 250 | Quick research experiments |
| M | 1,000 | Large-scale research experiments |
| L | 2,500 | Medium-scale industrial experiments |
| XL | 10,000 | Large-scale industrial experiments |
#### Transcribed Evaluation Subsets
| Subset | Hours | Remarks |
|:------:|:-----:|:--------|
| Dev | 12 | Randomly selected from the crawled Podcast and YouTube Data |
| Test | 40 | Part of the subset was randomly selected from the crawled Podcast and YouTube data; part of it was manually collected through other channels to have better coverage. |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
| Audio Source | Transcribed Hours | Acoustic Condition |
|:-------------|:----------------------:|:-------------------|
| Audiobook | 2,655 | <li>Reading</li><li>Various ages and accents</li> |
| Podcast | 3,498 | <li>Clean or background music</li><li>Indoor</li><li>Near-field</li><li>Spontaneous</li><li>Various ages and accents</li>|
| YouTube | 3,845 | <li>Clean and noisy</li><li>Indoor and outdoor</li><li>Near- and far-field</li><li>Reading and spontaneous</li><li>Various ages and accents</li> |
| ***Total*** | ***10,000*** ||
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
Development and test subsets are annotated by professional human annotators.
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
SpeechColab does not own the copyright of the audio files. For researchers and educators who wish to use the audio files for
non-commercial research and/or educational purposes, we can provide access through our site under certain conditions and terms.
In general, when training a machine learning model on a given dataset, the license of the model is **independent** of that of the
dataset. That is to say, speech recognition models trained on the GigaSpeech dataset may be eligible for a commercial license,
provided they abide by the 'Fair Use' terms of the underlying data and do not violate any explicit copyright restrictions.
This is likely to be true in most use-cases. However, it is your responsibility to verify the appropriate model license for
your specific use-case by confirming that the dataset usage abides by the Fair Use terms. SpeechColab is not responsible
for the license of any machine learning model trained on the GigaSpeech dataset.
### Citation Information
Please cite this paper if you find this work useful:
```bibtex
@inproceedings{GigaSpeech2021,
title={GigaSpeech: An Evolving, Multi-domain ASR Corpus with 10,000 Hours of Transcribed Audio},
booktitle={Proc. Interspeech 2021},
year=2021,
  author={Guoguo Chen and Shuzhou Chai and Guanbo Wang and Jiayu Du and Wei-Qiang Zhang and Chao Weng and Dan Su and Daniel Povey and Jan Trmal and Junbo Zhang and Mingjie Jin and Sanjeev Khudanpur and Shinji Watanabe and Shuaijiang Zhao and Wei Zou and Xiangang Li and Xuchen Yao and Yongqing Wang and Yujun Wang and Zhao You and Zhiyong Yan}
}
```
### Contributions
Thanks to [@polinaeterna](https://github.com/polinaeterna) and [@sanchit-gandhi](https://github.com/sanchit-gandhi)
for adding this dataset.
## Terms of Access
The "Researcher" has requested permission to use the GigaSpeech database (the "Database")
at Tsinghua University. In exchange for such permission, Researcher hereby agrees to the
following terms and conditions:
1. Researcher shall use the Database only for non-commercial research and educational purposes.
2. The SpeechColab team and Tsinghua University make no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose.
3. Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify the SpeechColab team and Tsinghua University, including their employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted audio files that he or she may create from the Database.
4. Researcher may provide research associates and colleagues with access to the Database provided that they first agree to be bound by these terms and conditions.
5. The SpeechColab team and Tsinghua University reserve the right to terminate Researcher's access to the Database at any time.
6. If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer. |
khh4vdd/gggbg | ---
license: other
---
|
RikoteMaster/isear_augmented_sample | ---
dataset_info:
features:
- name: Text_processed
dtype: string
- name: Emotion
dtype: string
- name: Augmented
dtype: bool
splits:
- name: train
num_bytes: 9254
num_examples: 63
- name: test
num_bytes: 10464
num_examples: 63
- name: validation
num_bytes: 9886
num_examples: 63
download_size: 26524
dataset_size: 29604
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-4b | ---
pretty_name: Evaluation run of Azure99/blossom-v4-qwen1_5-4b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Azure99/blossom-v4-qwen1_5-4b](https://huggingface.co/Azure99/blossom-v4-qwen1_5-4b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-4b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-19T16:11:51.291866](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-4b/blob/main/results_2024-02-19T16-11-51.291866.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5500335073961169,\n\
\ \"acc_stderr\": 0.034103061156946904,\n \"acc_norm\": 0.5522425202997023,\n\
\ \"acc_norm_stderr\": 0.03479721070253822,\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.47286479272599524,\n\
\ \"mc2_stderr\": 0.015086620345628354\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.42662116040955633,\n \"acc_stderr\": 0.014453185592920293,\n\
\ \"acc_norm\": 0.46075085324232085,\n \"acc_norm_stderr\": 0.014566303676636584\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5244971121290579,\n\
\ \"acc_stderr\": 0.00498378899268121,\n \"acc_norm\": 0.7080262895837482,\n\
\ \"acc_norm_stderr\": 0.0045374106155729454\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.030365050829115205,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.030365050829115205\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.49710982658959535,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6290322580645161,\n\
\ \"acc_stderr\": 0.027480541887953593,\n \"acc_norm\": 0.6290322580645161,\n\
\ \"acc_norm_stderr\": 0.027480541887953593\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713547,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713547\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.03308818594415751,\n\
\ \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.03308818594415751\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5538461538461539,\n \"acc_stderr\": 0.02520357177302833,\n \
\ \"acc_norm\": 0.5538461538461539,\n \"acc_norm_stderr\": 0.02520357177302833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255168,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255168\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7247706422018348,\n \"acc_stderr\": 0.0191490937431552,\n \"acc_norm\"\
: 0.7247706422018348,\n \"acc_norm_stderr\": 0.0191490937431552\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.42592592592592593,\n\
\ \"acc_stderr\": 0.03372343271653063,\n \"acc_norm\": 0.42592592592592593,\n\
\ \"acc_norm_stderr\": 0.03372343271653063\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.032566854844603886,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.032566854844603886\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293433,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293433\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514511,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514511\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.038142698932618374,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.038142698932618374\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.024662496845209807,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.024662496845209807\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7254150702426565,\n\
\ \"acc_stderr\": 0.015959829933084042,\n \"acc_norm\": 0.7254150702426565,\n\
\ \"acc_norm_stderr\": 0.015959829933084042\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584197,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584197\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n\
\ \"acc_stderr\": 0.015251931579208173,\n \"acc_norm\": 0.29497206703910617,\n\
\ \"acc_norm_stderr\": 0.015251931579208173\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023334,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027125115513166854,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027125115513166854\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n\
\ \"acc_stderr\": 0.01263579992276585,\n \"acc_norm\": 0.4276401564537158,\n\
\ \"acc_norm_stderr\": 0.01263579992276585\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.48161764705882354,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.48161764705882354,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5277777777777778,\n \"acc_stderr\": 0.020196594933541197,\n \
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.020196594933541197\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.03280188205348642,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.03280188205348642\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.03546976959393163,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.03546976959393163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.47286479272599524,\n\
\ \"mc2_stderr\": 0.015086620345628354\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6764009471191792,\n \"acc_stderr\": 0.013148883320923151\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5109931766489765,\n \
\ \"acc_stderr\": 0.013769155509690907\n }\n}\n```"
repo_url: https://huggingface.co/Azure99/blossom-v4-qwen1_5-4b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|arc:challenge|25_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|gsm8k|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hellaswag|10_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T16-11-51.291866.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T16-11-51.291866.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- '**/details_harness|winogrande|5_2024-02-19T16-11-51.291866.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-19T16-11-51.291866.parquet'
- config_name: results
data_files:
- split: 2024_02_19T16_11_51.291866
path:
- results_2024-02-19T16-11-51.291866.parquet
- split: latest
path:
- results_2024-02-19T16-11-51.291866.parquet
---
# Dataset Card for Evaluation run of Azure99/blossom-v4-qwen1_5-4b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Azure99/blossom-v4-qwen1_5-4b](https://huggingface.co/Azure99/blossom-v4-qwen1_5-4b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-4b",
"harness_winogrande_5",
split="train")
```
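As noted above, each run's split is named after the run timestamp. A minimal sketch of that mapping (assuming the convention is simply to replace the characters that are invalid in split names):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp to its split name by replacing '-' and ':' with '_'.

    e.g. "2024-02-19T16:11:51.291866" -> "2024_02_19T16_11_51.291866"
    (Hypothetical helper, not part of the `datasets` API.)
    """
    return ts.replace("-", "_").replace(":", "_")
```

This matches the split names visible in the YAML header, e.g. `2024_02_19T16_11_51.291866`.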
## Latest results
These are the [latest results from run 2024-02-19T16:11:51.291866](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-4b/blob/main/results_2024-02-19T16-11-51.291866.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5500335073961169,
"acc_stderr": 0.034103061156946904,
"acc_norm": 0.5522425202997023,
"acc_norm_stderr": 0.03479721070253822,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.47286479272599524,
"mc2_stderr": 0.015086620345628354
},
"harness|arc:challenge|25": {
"acc": 0.42662116040955633,
"acc_stderr": 0.014453185592920293,
"acc_norm": 0.46075085324232085,
"acc_norm_stderr": 0.014566303676636584
},
"harness|hellaswag|10": {
"acc": 0.5244971121290579,
"acc_stderr": 0.00498378899268121,
"acc_norm": 0.7080262895837482,
"acc_norm_stderr": 0.0045374106155729454
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.030365050829115205,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.030365050829115205
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.045796394220704334,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.045796394220704334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6290322580645161,
"acc_stderr": 0.027480541887953593,
"acc_norm": 0.6290322580645161,
"acc_norm_stderr": 0.027480541887953593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.03308818594415751,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.03308818594415751
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5538461538461539,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.5538461538461539,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255168,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7247706422018348,
"acc_stderr": 0.0191490937431552,
"acc_norm": 0.7247706422018348,
"acc_norm_stderr": 0.0191490937431552
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.032566854844603886,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.032566854844603886
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293433,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293433
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514511,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514511
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.038142698932618374,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.038142698932618374
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209807,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209807
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7254150702426565,
"acc_stderr": 0.015959829933084042,
"acc_norm": 0.7254150702426565,
"acc_norm_stderr": 0.015959829933084042
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584197,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584197
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29497206703910617,
"acc_stderr": 0.015251931579208173,
"acc_norm": 0.29497206703910617,
"acc_norm_stderr": 0.015251931579208173
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023334,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027125115513166854,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027125115513166854
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.01263579992276585,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.01263579992276585
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.48161764705882354,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.48161764705882354,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.020196594933541197,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.020196594933541197
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.03280188205348642,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.03280188205348642
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.03546976959393163,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.03546976959393163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.47286479272599524,
"mc2_stderr": 0.015086620345628354
},
"harness|winogrande|5": {
"acc": 0.6764009471191792,
"acc_stderr": 0.013148883320923151
},
"harness|gsm8k|5": {
"acc": 0.5109931766489765,
"acc_stderr": 0.013769155509690907
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
communityai/aptchat-v2-math-code-general-50k | ---
dataset_info:
features:
- name: category
dtype: string
- name: total_tokens
dtype: int64
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 618500560.0
num_examples: 48639
download_size: 281022915
dataset_size: 618500560.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chathuranga-jayanath/context-5-from-finmath-time4j-html-mavendoxia-portion-0.4-prompt-1 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: filepath
dtype: string
- name: start_bug_line
dtype: int64
- name: end_bug_line
dtype: int64
- name: bug
dtype: string
- name: fix
dtype: string
- name: ctx
dtype: string
splits:
- name: train
num_bytes: 68692657
num_examples: 78649
- name: validation
num_bytes: 8622835
num_examples: 9831
- name: test
num_bytes: 8597201
num_examples: 9831
download_size: 31117615
dataset_size: 85912693
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
TrainingDataPro/display-spoof-attack | ---
license: cc-by-nc-nd-4.0
task_categories:
- video-classification
- image-to-video
language:
- en
tags:
- code
- finance
- legal
---
# Liveness Detection - Video Classification
A biometric attack dataset containing **replay attacks** on real videos of people. A **replay attack** involves presenting a pre-recorded video or previously captured footage as if it were occurring in real time.
The primary objective is to distinguish between genuine, real-time footage and manipulated recordings.
The videos were gathered by capturing the faces of genuine individuals presenting spoofed facial presentations. Our dataset supports a novel approach that learns to detect spoofing techniques by extracting features from genuine facial images, preventing fake users from capturing such information.
The dataset contains videos of real humans with various **resolutions, views, and colors**, making it a comprehensive resource for researchers working on anti-spoofing technologies.

The dataset makes it possible to combine and apply different techniques, approaches, and models to the challenging task of distinguishing between genuine and spoofed inputs, enabling effective anti-spoofing solutions in active authentication systems. These solutions are crucial because newer devices, such as phones, have become vulnerable to spoofing attacks due to the availability of technologies that can create replays, reflections, and depth effects.
### People in the dataset

Our dataset also supports the exploration of neural architectures, such as deep neural networks, to identify distinguishing patterns and textures in different regions of the face, increasing the accuracy and generalizability of anti-spoofing models.
# 💴 For Commercial Usage: Full version of the dataset includes 30,000+ videos, leave a request on **[TrainingData](https://trainingdata.pro/data-market/anti-spoofing-replay?utm_source=huggingface&utm_medium=cpc&utm_campaign=display-spoof)** to buy the dataset
### Metadata for the full dataset:
- **replay.assignment_id** - unique identifier of the media file
- **real_assignment_id** - unique identifier of the media file from the [Antispoofing Real Dataset](https://trainingdata.pro/data-market/antispoofing-real?utm_source=kaggle&utm_medium=cpc&utm_campaign=antispoofing-replay-dataset)
- **worker_id** - unique identifier of the person
- **age** - age of the person
- **true_gender** - gender of the person
- **country** - country of the person
- **ethnicity** - ethnicity of the person
- **video_extension** - video extensions in the dataset
- **video_resolution** - video resolution in the dataset
- **video_duration** - video duration in the dataset
- **video_fps** - frames per second for video in the dataset
# 💴 Buy the Dataset: This is just an example of the data. Leave a request on **[https://trainingdata.pro/data-market](https://trainingdata.pro/data-market/anti-spoofing-replay?utm_source=huggingface&utm_medium=cpc&utm_campaign=display-spoof) to learn about the price and buy the dataset**
# Content
The dataset includes **files** folder with videos of people
### File with the extension .csv
- **id**: id of the person,
- **file**: link to access the display spoof attack video
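The `.csv` file described above can be parsed with the standard library. The snippet below is a minimal sketch using a made-up inline sample — the column names (`id`, `file`) follow the description above, but the file name and URLs are hypothetical:

```python
import csv
import io

# Hypothetical sample mirroring the metadata layout described above:
# each row pairs a person id with a link to the spoof-attack video.
sample = io.StringIO(
    "id,file\n"
    "1,https://example.com/videos/1.mp4\n"
    "2,https://example.com/videos/2.mp4\n"
)

# Parse rows into dicts keyed by the header names.
rows = list(csv.DictReader(sample))

# Build a lookup from person id to video link.
video_links = {row["id"]: row["file"] for row in rows}
print(video_links["1"])  # https://example.com/videos/1.mp4
```

In practice, replace the in-memory sample with an open handle on the actual metadata file shipped with the dataset.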
## **[TrainingData](https://trainingdata.pro/data-market/anti-spoofing-replay?utm_source=huggingface&utm_medium=cpc&utm_campaign=display-spoof)** provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **<https://www.kaggle.com/trainingdatapro/datasets>**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets**
*keywords: liveness detection systems, liveness detection dataset, biometric dataset, biometric data dataset, biometric system attacks, anti-spoofing dataset, face liveness detection, deep learning dataset, face spoofing database, face anti-spoofing, ibeta dataset, face anti spoofing, large-scale face anti spoofing, rich annotations anti spoofing dataset* |
EmilMarian/BOLA-Karate-DSL-Dataset | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_yam-peleg__Experiment4-7B | ---
pretty_name: Evaluation run of yam-peleg/Experiment4-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yam-peleg/Experiment4-7B](https://huggingface.co/yam-peleg/Experiment4-7B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yam-peleg__Experiment4-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T12:47:14.139387](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment4-7B/blob/main/results_2024-02-11T12-47-14.139387.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6545438799099946,\n\
\ \"acc_stderr\": 0.03201109405695293,\n \"acc_norm\": 0.6554330760311358,\n\
\ \"acc_norm_stderr\": 0.032658616723143415,\n \"mc1\": 0.5642594859241126,\n\
\ \"mc1_stderr\": 0.01735834539886313,\n \"mc2\": 0.7039319058753165,\n\
\ \"mc2_stderr\": 0.014998717036441298\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244482,\n\
\ \"acc_norm\": 0.7218430034129693,\n \"acc_norm_stderr\": 0.013094469919538805\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7112129057956582,\n\
\ \"acc_stderr\": 0.004522725412556956,\n \"acc_norm\": 0.8809002190798646,\n\
\ \"acc_norm_stderr\": 0.003232439139881551\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400506,\n \"\
acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.025542846817400506\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857416,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857416\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\"\
: 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \"\
acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45027932960893857,\n\
\ \"acc_stderr\": 0.016639615236845814,\n \"acc_norm\": 0.45027932960893857,\n\
\ \"acc_norm_stderr\": 0.016639615236845814\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.012741974333897229,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.012741974333897229\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000328,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000328\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5642594859241126,\n\
\ \"mc1_stderr\": 0.01735834539886313,\n \"mc2\": 0.7039319058753165,\n\
\ \"mc2_stderr\": 0.014998717036441298\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019815\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6345716451857468,\n \
\ \"acc_stderr\": 0.013264282030266637\n }\n}\n```"
repo_url: https://huggingface.co/yam-peleg/Experiment4-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|arc:challenge|25_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|gsm8k|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hellaswag|10_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T12-47-14.139387.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T12-47-14.139387.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- '**/details_harness|winogrande|5_2024-02-11T12-47-14.139387.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T12-47-14.139387.parquet'
- config_name: results
data_files:
- split: 2024_02_11T12_47_14.139387
path:
- results_2024-02-11T12-47-14.139387.parquet
- split: latest
path:
- results_2024-02-11T12-47-14.139387.parquet
---
# Dataset Card for Evaluation run of yam-peleg/Experiment4-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yam-peleg/Experiment4-7B](https://huggingface.co/yam-peleg/Experiment4-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yam-peleg__Experiment4-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-11T12:47:14.139387](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment4-7B/blob/main/results_2024-02-11T12-47-14.139387.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6545438799099946,
"acc_stderr": 0.03201109405695293,
"acc_norm": 0.6554330760311358,
"acc_norm_stderr": 0.032658616723143415,
"mc1": 0.5642594859241126,
"mc1_stderr": 0.01735834539886313,
"mc2": 0.7039319058753165,
"mc2_stderr": 0.014998717036441298
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.013363080107244482,
"acc_norm": 0.7218430034129693,
"acc_norm_stderr": 0.013094469919538805
},
"harness|hellaswag|10": {
"acc": 0.7112129057956582,
"acc_stderr": 0.004522725412556956,
"acc_norm": 0.8809002190798646,
"acc_norm_stderr": 0.003232439139881551
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.025542846817400506,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.025542846817400506
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857416,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.02616056824660146,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.02616056824660146
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45027932960893857,
"acc_stderr": 0.016639615236845814,
"acc_norm": 0.45027932960893857,
"acc_norm_stderr": 0.016639615236845814
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897229,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897229
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000328,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000328
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5642594859241126,
"mc1_stderr": 0.01735834539886313,
"mc2": 0.7039319058753165,
"mc2_stderr": 0.014998717036441298
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.010995172318019815
},
"harness|gsm8k|5": {
"acc": 0.6345716451857468,
"acc_stderr": 0.013264282030266637
}
}
```
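The per-task scores in the JSON above can be aggregated programmatically. A minimal sketch (key names taken from the results listing; the unweighted macro-average is an assumption for illustration, since the official leaderboard applies its own aggregation):

```python
def mmlu_macro_average(results: dict) -> float:
    """Macro-average of `acc` over the hendrycksTest (MMLU) subtasks.

    `results` maps task names such as "harness|hendrycksTest-virology|5"
    to metric dicts with "acc"/"acc_stderr" keys, as in the JSON above.
    """
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    if not accs:
        raise ValueError("no hendrycksTest entries found")
    return sum(accs) / len(accs)
```

Pass it the parsed results dict (e.g. the object shown above after `json.load`); non-MMLU entries such as `harness|gsm8k|5` and the `"all"` summary are skipped by the prefix filter.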
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
EliKet/miumiu | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: image_name
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 21220034.0
num_examples: 18
download_size: 21212241
dataset_size: 21220034.0
---
# Dataset Card for "miumiu"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lzkhit/images | ---
license: apache-2.0
---
|
Aruno/UTKFace-gemini | ---
task_categories:
- image-classification
language:
- en
pretty_name: UTKFace Gemini Annotation
size_categories:
- 1K<n<10K
---
[UTKFace](https://susanqq.github.io/UTKFace/) dataset annotated using [Google Gemini](https://deepmind.google/technologies/gemini/).
This dataset contains only the annotations, not the images themselves (each JSON file name corresponds to the matching image file name).
* Used model: `gemini-pro-vision`
## Format
```json
{
"sex":male/female,
"attractiveness":very ugly/ugly/normal/attractive/very attractive,
"age":young child/child/adolescent/young adult/adult/young senior/senior/old/very old,
"character":kind/jealous/violent/frienly/playboy/intersting/boring,
"description":string,
"expression":angry/disgust/ear/happy/neutral/sad/surprise
}
```
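Since the annotations ship as one JSON file per image, a small loader is enough to work with them. A minimal sketch, assuming the JSON files have been downloaded into a local directory (the directory name and helper names below are illustrative, not part of the dataset):

```python
import json
from collections import Counter
from pathlib import Path

def load_annotations(annotation_dir):
    """Load every per-image JSON annotation in `annotation_dir`.

    Keys follow the card's schema: sex, attractiveness, age, character,
    description, expression. The directory layout is an assumption;
    point it at wherever the JSON files were extracted.
    """
    annotations = {}
    for path in Path(annotation_dir).glob("*.json"):
        with open(path, encoding="utf-8") as f:
            # The JSON file name corresponds to the image file name.
            annotations[path.stem] = json.load(f)
    return annotations

def expression_counts(annotations):
    """Tally the facial-expression labels across all annotations."""
    return Counter(a.get("expression") for a in annotations.values())
```

For example, `expression_counts(load_annotations("UTKFace-gemini"))` tallies how often each expression label occurs across the downloaded annotations.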
## Used prompt
```
Evaluate the image as below:
* sex: sex of the face
* age: how old look the person
* attractiveness: level of attractiveness
* character: character of the face
* description: description of the image
* expression: facial expression
* Output following below JSON format (do not include markdown format, all field must be filled)
{"sex":male/female, "attractiveness":very ugly/ugly/normal/attractive/very attractive, "age":young child/child/adolescent/young adult/adult/young senior/senior/old/very old, "character":kind/jealous/violent/frienly/playboy/intersting/boring, "description":string, "expression":angry/disgust/ear/happy/neutral/sad/surprise}
``` |
open-llm-leaderboard/details_automerger__OgnoExperiment27-7B | ---
pretty_name: Evaluation run of automerger/OgnoExperiment27-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [automerger/OgnoExperiment27-7B](https://huggingface.co/automerger/OgnoExperiment27-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_automerger__OgnoExperiment27-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T04:37:37.340803](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__OgnoExperiment27-7B/blob/main/results_2024-03-11T04-37-37.340803.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6507742608726995,\n\
\ \"acc_stderr\": 0.03214096587643815,\n \"acc_norm\": 0.6500897932522264,\n\
\ \"acc_norm_stderr\": 0.03281422384865019,\n \"mc1\": 0.6389228886168911,\n\
\ \"mc1_stderr\": 0.01681431284483688,\n \"mc2\": 0.7840953221504796,\n\
\ \"mc2_stderr\": 0.013622329121050615\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274777,\n\
\ \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523198\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7218681537542322,\n\
\ \"acc_stderr\": 0.004471629546895095,\n \"acc_norm\": 0.8940450109539932,\n\
\ \"acc_norm_stderr\": 0.0030715098609056667\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948482,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948482\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n\
\ \"acc_stderr\": 0.01657402721951763,\n \"acc_norm\": 0.4335195530726257,\n\
\ \"acc_norm_stderr\": 0.01657402721951763\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6389228886168911,\n\
\ \"mc1_stderr\": 0.01681431284483688,\n \"mc2\": 0.7840953221504796,\n\
\ \"mc2_stderr\": 0.013622329121050615\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.686125852918878,\n \
\ \"acc_stderr\": 0.012782681251053194\n }\n}\n```"
repo_url: https://huggingface.co/automerger/OgnoExperiment27-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|arc:challenge|25_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|gsm8k|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hellaswag|10_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T04-37-37.340803.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T04-37-37.340803.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- '**/details_harness|winogrande|5_2024-03-11T04-37-37.340803.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T04-37-37.340803.parquet'
- config_name: results
data_files:
- split: 2024_03_11T04_37_37.340803
path:
- results_2024-03-11T04-37-37.340803.parquet
- split: latest
path:
- results_2024-03-11T04-37-37.340803.parquet
---
# Dataset Card for Evaluation run of automerger/OgnoExperiment27-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [automerger/OgnoExperiment27-7B](https://huggingface.co/automerger/OgnoExperiment27-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_automerger__OgnoExperiment27-7B",
"harness_winogrande_5",
split="train")
```
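The per-task configuration and split names listed in the YAML above follow a predictable scheme: MMLU subtasks become `harness_hendrycksTest_<task>_<fewshot>` configs, and each run's split name is its timestamp with `-` and `:` replaced by `_`. The helpers below are a convenience sketch inferred from that naming pattern, not an official API; the function names are hypothetical.

```python
# Hypothetical helpers that reproduce the naming scheme visible in the
# YAML configuration above (a sketch, not part of the datasets library).

def harness_config_name(task: str, num_fewshot: int = 5) -> str:
    """Map an MMLU subtask to its config name, e.g.
    'abstract_algebra' -> 'harness_hendrycksTest_abstract_algebra_5'."""
    return f"harness_hendrycksTest_{task}_{num_fewshot}"


def timestamp_split(timestamp: str) -> str:
    """Convert a run timestamp such as '2024-03-11T04:37:37.340803'
    into the split name used by this card, '2024_03_11T04_37_37.340803'."""
    date, time = timestamp.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")
```

For example, `harness_config_name("winogrande")` would not apply (WinoGrande has its own `harness_winogrande_5` config), but for any MMLU subtask the helper yields the config name to pass as the second argument of `load_dataset`, and `timestamp_split(...)` yields the timestamped split name alongside `"latest"`.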
## Latest results
These are the [latest results from run 2024-03-11T04:37:37.340803](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__OgnoExperiment27-7B/blob/main/results_2024-03-11T04-37-37.340803.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6507742608726995,
"acc_stderr": 0.03214096587643815,
"acc_norm": 0.6500897932522264,
"acc_norm_stderr": 0.03281422384865019,
"mc1": 0.6389228886168911,
"mc1_stderr": 0.01681431284483688,
"mc2": 0.7840953221504796,
"mc2_stderr": 0.013622329121050615
},
"harness|arc:challenge|25": {
"acc": 0.7133105802047781,
"acc_stderr": 0.013214986329274777,
"acc_norm": 0.7337883959044369,
"acc_norm_stderr": 0.012915774781523198
},
"harness|hellaswag|10": {
"acc": 0.7218681537542322,
"acc_stderr": 0.004471629546895095,
"acc_norm": 0.8940450109539932,
"acc_norm_stderr": 0.0030715098609056667
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948482,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.01657402721951763,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.01657402721951763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657476,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6389228886168911,
"mc1_stderr": 0.01681431284483688,
"mc2": 0.7840953221504796,
"mc2_stderr": 0.013622329121050615
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571764
},
"harness|gsm8k|5": {
"acc": 0.686125852918878,
"acc_stderr": 0.012782681251053194
}
}
```
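The per-task accuracies above can be rolled up into a single headline number by macro-averaging the subtask scores. The sketch below copies a few values from the results block and takes a plain unweighted mean; treating that as the leaderboard's aggregation method is an assumption here, not something this card states:

```python
# Macro-average a handful of the per-task accuracies listed above.
# Values are copied verbatim from the results block; the unweighted
# mean is an assumed stand-in for the leaderboard's MMLU aggregation.
subtask_acc = {
    "college_medicine": 0.6416184971098265,
    "college_physics": 0.35294117647058826,
    "computer_security": 0.75,
    "conceptual_physics": 0.5531914893617021,
}

macro_avg = sum(subtask_acc.values()) / len(subtask_acc)
print(f"macro-average accuracy over {len(subtask_acc)} subtasks: {macro_avg:.4f}")
```

Extending the dictionary to all 57 `hendrycksTest` subtasks would reproduce a full MMLU-style average from this card's numbers.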
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_simonveitner__Math-OpenHermes-2.5-Mistral-7B | ---
pretty_name: Evaluation run of simonveitner/Math-OpenHermes-2.5-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [simonveitner/Math-OpenHermes-2.5-Mistral-7B](https://huggingface.co/simonveitner/Math-OpenHermes-2.5-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_simonveitner__Math-OpenHermes-2.5-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-13T15:42:58.616928](https://huggingface.co/datasets/open-llm-leaderboard/details_simonveitner__Math-OpenHermes-2.5-Mistral-7B/blob/main/results_2023-12-13T15-42-58.616928.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6335216907991242,\n\
\ \"acc_stderr\": 0.03232537565689536,\n \"acc_norm\": 0.6354467611227453,\n\
\ \"acc_norm_stderr\": 0.0329714707471459,\n \"mc1\": 0.3488372093023256,\n\
\ \"mc1_stderr\": 0.01668441985998689,\n \"mc2\": 0.5090845417111513,\n\
\ \"mc2_stderr\": 0.015345799600128406\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472437,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491887\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6413065126468831,\n\
\ \"acc_stderr\": 0.004786368011500458,\n \"acc_norm\": 0.8307110137422824,\n\
\ \"acc_norm_stderr\": 0.0037424055874098784\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474887,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474887\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218957,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218957\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.0249393139069408,\n \
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.0249393139069408\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848036,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848036\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n\
\ \"acc_stderr\": 0.02759917430064077,\n \"acc_norm\": 0.8088235294117647,\n\
\ \"acc_norm_stderr\": 0.02759917430064077\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n\
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834827,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.01446589382985993,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.01446589382985993\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729474,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729474\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n\
\ \"acc_stderr\": 0.012728446067669971,\n \"acc_norm\": 0.4595827900912647,\n\
\ \"acc_norm_stderr\": 0.012728446067669971\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.01913994374848704,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.01913994374848704\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3488372093023256,\n\
\ \"mc1_stderr\": 0.01668441985998689,\n \"mc2\": 0.5090845417111513,\n\
\ \"mc2_stderr\": 0.015345799600128406\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663592\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \
\ \"acc_stderr\": 0.01342838248127423\n }\n}\n```"
repo_url: https://huggingface.co/simonveitner/Math-OpenHermes-2.5-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|arc:challenge|25_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|gsm8k|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hellaswag|10_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T15-42-58.616928.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T15-42-58.616928.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- '**/details_harness|winogrande|5_2023-12-13T15-42-58.616928.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-13T15-42-58.616928.parquet'
- config_name: results
data_files:
- split: 2023_12_13T15_42_58.616928
path:
- results_2023-12-13T15-42-58.616928.parquet
- split: latest
path:
- results_2023-12-13T15-42-58.616928.parquet
---
# Dataset Card for Evaluation run of simonveitner/Math-OpenHermes-2.5-Mistral-7B
Dataset automatically created during the evaluation run of model [simonveitner/Math-OpenHermes-2.5-Mistral-7B](https://huggingface.co/simonveitner/Math-OpenHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_simonveitner__Math-OpenHermes-2.5-Mistral-7B",
"harness_winogrande_5",
split="train")
```
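The per-task entries in the results JSON below follow a flat `harness|<task>|<n_shot>` key scheme. As a hypothetical helper (not part of this card or the leaderboard tooling), you could flatten such a results dict into a task-to-accuracy mapping:

```python
# Extract per-task accuracy from an Open LLM Leaderboard-style results dict.
# The "harness|<task>|<n_shot>" key format is taken from the JSON shown below;
# this helper itself is illustrative only.

def task_accuracies(results: dict) -> dict:
    """Map 'task (n-shot)' -> acc, skipping the 'all' aggregate entry."""
    out = {}
    for key, metrics in results.items():
        if key == "all" or "|" not in key:
            continue
        _harness, task, n_shot = key.split("|")
        out[f"{task} ({n_shot}-shot)"] = metrics["acc"]
    return out

# Values copied from the aggregated results below.
sample = {
    "all": {"acc": 0.6335216907991242},
    "harness|arc:challenge|25": {"acc": 0.590443686006826},
    "harness|hellaswag|10": {"acc": 0.6413065126468831},
}
print(task_accuracies(sample))
```

This is only a convenience sketch; the raw per-example details live in the parquet files listed in the configurations above.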
## Latest results
These are the [latest results from run 2023-12-13T15:42:58.616928](https://huggingface.co/datasets/open-llm-leaderboard/details_simonveitner__Math-OpenHermes-2.5-Mistral-7B/blob/main/results_2023-12-13T15-42-58.616928.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6335216907991242,
"acc_stderr": 0.03232537565689536,
"acc_norm": 0.6354467611227453,
"acc_norm_stderr": 0.0329714707471459,
"mc1": 0.3488372093023256,
"mc1_stderr": 0.01668441985998689,
"mc2": 0.5090845417111513,
"mc2_stderr": 0.015345799600128406
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472437,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.014104578366491887
},
"harness|hellaswag|10": {
"acc": 0.6413065126468831,
"acc_stderr": 0.004786368011500458,
"acc_norm": 0.8307110137422824,
"acc_norm_stderr": 0.0037424055874098784
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474887,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218957,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218957
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.0249393139069408,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.0249393139069408
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848036,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02759917430064077,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02759917430064077
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699796,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834827,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.01446589382985993,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.01446589382985993
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729474,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729474
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.012728446067669971,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.012728446067669971
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.01913994374848704,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.01913994374848704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3488372093023256,
"mc1_stderr": 0.01668441985998689,
"mc2": 0.5090845417111513,
"mc2_stderr": 0.015345799600128406
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663592
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.01342838248127423
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/executor_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of executor_arknights
This is the dataset of executor_arknights, containing 122 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 122 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 268 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 122 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 122 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 122 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 122 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 122 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 268 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 268 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 268 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
SALT-NLP/Design2Code-hf | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 104888755
num_examples: 484
download_size: 77578732
dataset_size: 104888755
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: odc-by
---
This dataset consists of 484 webpages from the C4 validation set, intended for testing multimodal LLMs on converting visual designs into code implementations.
See the dataset in raw-file format [here](https://huggingface.co/datasets/SALT-NLP/Design2Code).
Note that all images in these webpages are replaced by a placeholder image (rick.jpg).
Please refer to our [project page](https://salt-nlp.github.io/Design2Code/) and [our paper](https://arxiv.org/abs/2403.03163) for more information. |
iocuydi/amharic-blip-laion | ---
license: cc-by-4.0
---
Dataset used for pretraining the CLIP alignment step of Amharic LLaVA.
More details: https://arxiv.org/abs/2403.06354 |
FaalSa/dataR | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 57629
num_examples: 1
- name: validation
num_bytes: 58109
num_examples: 1
- name: test
num_bytes: 58589
num_examples: 1
download_size: 35539
dataset_size: 174327
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
claudios/MVD | ---
arxiv: 1801.01681
dataset_info:
features:
- name: func
dtype: string
- name: path
dtype: string
- name: source
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 156793256
num_examples: 123515
- name: validation
num_bytes: 27720814
num_examples: 21797
- name: test
num_bytes: 45934658
num_examples: 36329
download_size: 69412844
dataset_size: 230448728
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
task_categories:
- text-classification
tags:
- code
---
This is an unofficial HuggingFace version of the MVD dataset from "[VulDeePecker: A Deep Learning-Based System for Vulnerability Detection](https://arxiv.org/abs/1801.01681)". See the [source files](https://github.com/muVulDeePecker/muVulDeePecker/tree/master/source%20files) for the relevant source code referred to by the `path` column.
There are 41 possible classes:
```
{
0: 'non-vulnerable',
1: 'CWE-404',
2: 'CWE-476',
3: 'CWE-119',
4: 'CWE-706',
5: 'CWE-670',
6: 'CWE-673',
7: 'CWE-119, CWE-666, CWE-573',
8: 'CWE-573',
9: 'CWE-668',
10: 'CWE-400, CWE-665, CWE-020',
11: 'CWE-662',
12: 'CWE-400',
13: 'CWE-665',
14: 'CWE-020',
15: 'CWE-074',
16: 'CWE-362',
17: 'CWE-191',
18: 'CWE-190',
19: 'CWE-610',
20: 'CWE-704',
21: 'CWE-170',
22: 'CWE-676',
23: 'CWE-187',
24: 'CWE-138',
25: 'CWE-369',
26: 'CWE-662, CWE-573',
27: 'CWE-834',
28: 'CWE-400, CWE-665',
29: 'CWE-400, CWE-404',
30: 'CWE-221',
31: 'CWE-754',
32: 'CWE-311',
33: 'CWE-404, CWE-668',
34: 'CWE-506',
35: 'CWE-758',
36: 'CWE-666',
37: 'CWE-467',
38: 'CWE-327',
39: 'CWE-666, CWE-573',
40: 'CWE-469'
}
```
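A label id can be turned back into its CWE identifiers with a small helper. This is a sketch: the dict is abbreviated (extend it with the full table above), and multi-CWE labels are stored as comma-separated strings.

```python
# Sketch: map MVD integer labels back to CWE identifiers.
# The dict is abbreviated here; fill it in from the full table above.
LABEL2CWE = {
    0: "non-vulnerable",
    1: "CWE-404",
    7: "CWE-119, CWE-666, CWE-573",
    18: "CWE-190",
}

def label_to_cwes(label: int) -> list[str]:
    """Return the CWE ids for a label (empty list for non-vulnerable code)."""
    name = LABEL2CWE[label]
    if name == "non-vulnerable":
        return []
    return [cwe.strip() for cwe in name.split(",")]
```

For example, `label_to_cwes(7)` yields `['CWE-119', 'CWE-666', 'CWE-573']`.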
***
# Multiclass Vulnerability Dataset (MVD)
MVD is a database for research on multiclass vulnerability detection with deep learning. The dataset is based on the NIST Software Assurance Reference Dataset (SARD) and the National Vulnerability Database (NVD). To date, it contains 181,641 code gadgets covering 40 types of vulnerabilities. Each code gadget in MVD is composed of multiple program statements that have direct or indirect data-dependence and control-dependence relationships with library/API function calls. In total, the code gadgets in MVD are extracted from 33,409 test cases of SARD and NVD; 138,522 of the code gadgets are non-vulnerable and 43,119 are vulnerable.
In this repository, the compressed file `mvd.txt.zip` stores the 181,641 code gadgets and their corresponding labels. The file `label2CWE.txt` records the mapping between each label and the corresponding vulnerability. The folder `source files` contains 33,409 source files from which the code gadgets were extracted. |
FanChen0116/19100_chat_50x_slot_limit | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-time
'2': B-date
'3': B-last_name
'4': B-people
'5': I-date
'6': I-people
'7': I-last_name
'8': I-first_name
'9': B-first_name
'10': B-time
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 580637
num_examples: 3200
- name: validation
num_bytes: 5405
num_examples: 32
- name: test
num_bytes: 5405
num_examples: 32
download_size: 0
dataset_size: 591447
---
# Dataset Card for "19100_chat_50x_slot_limit"
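The `labels` feature uses BIO tags over tokens (see the class-label table above). Decoding a tagged token sequence into `(slot, text)` spans can be sketched as follows; this is illustrative code, not part of the dataset:

```python
# Tag names in the order of the class-label ids listed above.
ID2TAG = ["O", "I-time", "B-date", "B-last_name", "B-people", "I-date",
          "I-people", "I-last_name", "I-first_name", "B-first_name", "B-time"]

def decode_slots(tokens, label_ids):
    """Collect (slot_name, text) spans from BIO label ids."""
    spans, current_slot, current_tokens = [], None, []
    for tok, lid in zip(tokens, label_ids):
        tag = ID2TAG[lid]
        if tag.startswith("B-"):  # a new span begins
            if current_slot:
                spans.append((current_slot, " ".join(current_tokens)))
            current_slot, current_tokens = tag[2:], [tok]
        elif tag.startswith("I-") and current_slot == tag[2:]:
            current_tokens.append(tok)  # continue the open span
        else:  # "O" or an inconsistent I- tag closes the span
            if current_slot:
                spans.append((current_slot, " ".join(current_tokens)))
            current_slot, current_tokens = None, []
    if current_slot:
        spans.append((current_slot, " ".join(current_tokens)))
    return spans
```

For example, tokens `["two", "people"]` tagged `B-people I-people` decode to the single span `("people", "two people")`.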
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hundred9/Duaaii_6 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
splits:
- name: train
num_bytes: 4878651.0
num_examples: 647
download_size: 4842183
dataset_size: 4878651.0
---
# Dataset Card for "Duaaii_6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/yui_swordartonline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yui (Sword Art Online)
This is the dataset of yui (Sword Art Online), containing 106 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
liuyanchen1015/MULTI_VALUE_stsb_null_genitive | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 26653
num_examples: 133
- name: test
num_bytes: 20508
num_examples: 99
- name: train
num_bytes: 121781
num_examples: 644
download_size: 120137
dataset_size: 168942
---
# Dataset Card for "MULTI_VALUE_stsb_null_genitive"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_8_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 27104730
num_examples: 43998
download_size: 13635009
dataset_size: 27104730
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_8_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johannes-garstenauer/structs_token_size_4_one_heap | ---
dataset_info:
features:
- name: struct
dtype: string
splits:
- name: train
num_bytes: 346145
num_examples: 3175
download_size: 102623
dataset_size: 346145
---
# Dataset Card for "structs_token_size_4_one_heap"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
buddhist-nlp/skt-en-itihasa | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: target_text
dtype: string
splits:
- name: train
num_bytes: 34487575
num_examples: 68963
- name: validation
num_bytes: 455988
num_examples: 982
- name: test
num_bytes: 456167
num_examples: 838
download_size: 16921264
dataset_size: 35399730
---
# Dataset Card for "skt-en-itihasa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1712827283 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 35783
num_examples: 85
download_size: 21223
dataset_size: 35783
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712827283"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/mmarco_fr_dev | ---
pretty_name: '`mmarco/fr/dev`'
viewer: false
source_datasets: ['irds/mmarco_fr']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/fr/dev`
The `mmarco/fr/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/fr/dev).
# Data
This dataset provides:
- `queries` (i.e., topics); count=101,093
- `qrels`: (relevance assessments); count=59,273
- For `docs`, use [`irds/mmarco_fr`](https://huggingface.co/datasets/irds/mmarco_fr)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_fr_dev', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mmarco_fr_dev', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
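A common next step is grouping the relevant `doc_id`s under each query. A minimal sketch over the record layout shown above (the sample records below are made up, not real mMARCO ids):

```python
from collections import defaultdict

def relevant_docs_by_query(qrels, min_relevance=1):
    """Group relevant doc_ids under each query_id from qrels records."""
    by_query = defaultdict(set)
    for record in qrels:
        if record["relevance"] >= min_relevance:
            by_query[record["query_id"]].add(record["doc_id"])
    return dict(by_query)

# Tiny illustrative records (not real mMARCO ids):
sample = [
    {"query_id": "q1", "doc_id": "d1", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d2", "relevance": 0, "iteration": "0"},
    {"query_id": "q2", "doc_id": "d3", "relevance": 1, "iteration": "0"},
]
print(relevant_docs_by_query(sample))  # -> {'q1': {'d1'}, 'q2': {'d3'}}
```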
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
csitfun/LogiCoT | ---
license: cc-by-nc-nd-4.0
task_categories:
- text-generation
language:
- en
- zh
tags:
- instruction-finetuning
pretty_name: logicot
size_categories:
- 100K<n<1M
---
The instructions and demonstrations for building formal logical reasoning capable Generative Large Language models. CoT rationales are generated with the GPT-4 API.
> For non-commercial research purposes only.
Update: Our updated paper has been accepted to the Findings of EMNLP 2023.
The dataset is hosted on the Huggingface Datasets. It is the only distribution channel we currently allow. **You can download data examples from our Github [Link](https://github.com/csitfun/LogiCoT)**
**Important**: To request the dataset, please
1. Submit an access request through your huggingface account.
2. Send an email to Hanmeng Liu at hanhaishiyi@gmail.com. Please tell us your huggingface account username, your real name, organization, and purpose, and please guarantee that you will not share the data with others. We will approve your request after your info is provided.
Your access will be granted as soon as possible after the email has been sent; please check back in a couple of hours. Note that you might not receive a reply email due to the volume of requests.
`general_inference.jsonl`: English instruction tuning data for the general inference task
`general_inference_pruned`: a pruned version with a smaller size while more diverse
`mrc.jsonl`: English instruction tuning data for the logical reading comprehension task
`mrc_zh.jsonl`: Chinese instruction tuning data for the logical reading comprehension task
`entailmentbank.jsonl`: derived from the EntailmentBank data
`folio2instruction.jsonl`: derived from the FOLIO data
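Each of these files is JSON Lines, one example per line. A minimal, self-contained reader sketch (the record fields shown are placeholders, not the dataset's actual schema):

```python
import json
import os
import tempfile

def read_jsonl(path):
    """Yield one JSON object per non-empty line of a .jsonl file."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)

# Demo with a made-up record (the real field names may differ):
with tempfile.NamedTemporaryFile("w", suffix=".jsonl", delete=False,
                                 encoding="utf-8") as f:
    f.write(json.dumps({"instruction": "...", "output": "..."}) + "\n")
    path = f.name
records = list(read_jsonl(path))
os.remove(path)
print(len(records))  # -> 1
```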
For more information, please refer to our arXiv preprint, [LogiCoT: Logical Chain-of-Thought Instruction-tuning Data Collection with GPT-4](https://arxiv.org/abs/2305.12147).
## Seminal Data
* LogicInference
* EntailmentBank
* FOLIO
* ReClor
* LogiQA
## Instruction types
### General inference task
* Language to Logic
* One-Step Inference
* Inference Chains
### Multi-choice reading comprehension task
* Identify the Necessary Claim
* Strengthen an Argument
* Weaken an Argument
* Resolve a Situation
* Identify a Flaw in Arguments Reasoning
## How to cite
```
@inproceedings{liu2023logicot,
title={LogiCoT: Logical Chain-of-Thought Instruction Tuning},
author={Liu, Hanmeng and Teng, Zhiyang and Cui, Leyang and Zhang, Chaoli and Zhou, Qiji and Zhang, Yue},
booktitle={Findings of the Association for Computational Linguistics: EMNLP 2023},
pages={2908--2921},
year={2023}
}
``` |
AdapterOcean/python3-standardized_cluster_6_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 8339192
num_examples: 2838
download_size: 0
dataset_size: 8339192
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_6_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_qqp_reduced_relative | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 233739
num_examples: 1186
- name: test
num_bytes: 2220506
num_examples: 11410
- name: train
num_bytes: 2149800
num_examples: 10834
download_size: 2872278
dataset_size: 4604045
---
# Dataset Card for "MULTI_VALUE_qqp_reduced_relative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_wnli_medial_object_perfect | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: train
num_bytes: 2433
num_examples: 11
download_size: 3610
dataset_size: 2433
---
# Dataset Card for "MULTI_VALUE_wnli_medial_object_perfect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_huggyllama__llama-65b | ---
pretty_name: Evaluation run of huggyllama/llama-65b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [huggyllama/llama-65b](https://huggingface.co/huggyllama/llama-65b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggyllama__llama-65b_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-07T09:32:32.801713](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-65b_public/blob/main/results_2023-11-07T09-32-32.801713.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.00039210421902984954,\n \"f1\": 0.05626468120805396,\n\
\ \"f1_stderr\": 0.0012002201848354834,\n \"acc\": 0.5989119618375836,\n\
\ \"acc_stderr\": 0.011990281632531736\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902984954,\n\
\ \"f1\": 0.05626468120805396,\n \"f1_stderr\": 0.0012002201848354834\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.37225170583775585,\n \
\ \"acc_stderr\": 0.013315375362565038\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498433\n\
\ }\n}\n```"
repo_url: https://huggingface.co/huggyllama/llama-65b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_05T01_43_41.465043
path:
- '**/details_harness|drop|3_2023-11-05T01-43-41.465043.parquet'
- split: 2023_11_07T09_32_32.801713
path:
- '**/details_harness|drop|3_2023-11-07T09-32-32.801713.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-07T09-32-32.801713.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_05T01_43_41.465043
path:
- '**/details_harness|gsm8k|5_2023-11-05T01-43-41.465043.parquet'
- split: 2023_11_07T09_32_32.801713
path:
- '**/details_harness|gsm8k|5_2023-11-07T09-32-32.801713.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-07T09-32-32.801713.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_05T01_43_41.465043
path:
- '**/details_harness|winogrande|5_2023-11-05T01-43-41.465043.parquet'
- split: 2023_11_07T09_32_32.801713
path:
- '**/details_harness|winogrande|5_2023-11-07T09-32-32.801713.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-07T09-32-32.801713.parquet'
- config_name: results
data_files:
- split: 2023_11_05T01_43_41.465043
path:
- results_2023-11-05T01-43-41.465043.parquet
- split: 2023_11_07T09_32_32.801713
path:
- results_2023-11-07T09-32-32.801713.parquet
- split: latest
path:
- results_2023-11-07T09-32-32.801713.parquet
---
# Dataset Card for Evaluation run of huggyllama/llama-65b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/huggyllama/llama-65b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [huggyllama/llama-65b](https://huggingface.co/huggyllama/llama-65b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huggyllama__llama-65b_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-07T09:32:32.801713](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-65b_public/blob/main/results_2023-11-07T09-32-32.801713.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902984954,
"f1": 0.05626468120805396,
"f1_stderr": 0.0012002201848354834,
"acc": 0.5989119618375836,
"acc_stderr": 0.011990281632531736
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902984954,
"f1": 0.05626468120805396,
"f1_stderr": 0.0012002201848354834
},
"harness|gsm8k|5": {
"acc": 0.37225170583775585,
"acc_stderr": 0.013315375362565038
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498433
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
pkyoyetera/luganda_english_dataset | ---
dataset_info:
features:
- name: English
dtype: string
- name: Luganda
dtype: string
splits:
- name: train
num_bytes: 11844863.620338032
num_examples: 78238
download_size: 7020236
dataset_size: 11844863.620338032
license: apache-2.0
task_categories:
- translation
language:
- en
- lg
size_categories:
- 10K<n<100K
---
# Dataset Card for "luganda_english_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
The dataset might contain a few mistakes, especially in the one-word translations. Indicators for verbs and nouns (v.i and n.i) may not have been filtered out properly. |
robert-altmiller/dolly-code-migration | ---
license: apache-2.0
language:
- en
tags:
- code
- dataset
pretty_name: dolly-code-migration
size_categories:
- n<1K
---
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.