| datasetId | card |
|---|---|
wlaminack/Nonlinearltestingdataset | ---
license: apache-2.0
---
```python
import numpy as np

def basic(array1):
    # Center the first four coordinates around 0.
    x = array1[0] - 0.5
    y = array1[1] - 0.5
    z = array1[2] - 0.5
    t = array1[3] - 0.5
    r2 = x * x + y * y + z * z + t * t
    # Nonlinear response plus noise scaled by the fifth column.
    return 3 * np.sin(r2) + np.random.random() * array1[4]

# `a` is assumed to be an (n, 5) array; apply `basic` to each row.
f = np.apply_along_axis(basic, 1, a)
``` |
ayan1988/diffusion.interaction | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 8311379.0
num_examples: 105
download_size: 4666120
dataset_size: 8311379.0
---
# Dataset Card for "diffusion.interaction"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
csujeong/KoAlpaca-v1.1a | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 23371027
num_examples: 21155
download_size: 12856014
dataset_size: 23371027
task_categories:
- text-generation
language:
- ko
tags:
- KoAlpaca
pretty_name: KoAlpaca-v1.1a
---
# Dataset Card for "KoAlpaca-v1.1a"
## Project Repo
- Github Repo: [Beomi/KoAlpaca](https://github.com/Beomi/KoAlpaca)
## How to use
```python
>>> from datasets import load_dataset
>>> ds = load_dataset("beomi/KoAlpaca-v1.1a", split="train")
>>> ds
Dataset({
    features: ['instruction', 'output', 'url'],
    num_rows: 21155
})
```
```python
>>> ds[0]
{'instruction': '양파는 어떤 식물 부위인가요? 그리고 고구마는 뿌리인가요?',
'output': '양파는 잎이 아닌 식물의 줄기 부분입니다. 고구마는 식물의 뿌리 부분입니다. \n\n식물의 부위의 구분에 대해 궁금해하는 분이라면 분명 이 질문에 대한 답을 찾고 있을 것입니다. 양파는 잎이 아닌 줄기 부분입니다. 고구마는 다른 질문과 답변에서 언급된 것과 같이 뿌리 부분입니다. 따라서, 양파는 식물의 줄기 부분이 되고, 고구마는 식물의 뿌리 부분입니다.\n\n 덧붙이는 답변: 고구마 줄기도 볶아먹을 수 있나요? \n\n고구마 줄기도 식용으로 볶아먹을 수 있습니다. 하지만 줄기 뿐만 아니라, 잎, 씨, 뿌리까지 모든 부위가 식용으로 활용되기도 합니다. 다만, 한국에서는 일반적으로 뿌리 부분인 고구마를 주로 먹습니다.',
'url': 'https://kin.naver.com/qna/detail.naver?d1id=11&dirId=1116&docId=55320268'}
``` |
CyberHarem/pps_43_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of pps_43/PPS-43/PPS-43 (Girls' Frontline)
This is the dataset of pps_43/PPS-43/PPS-43 (Girls' Frontline), containing 25 images and their tags.
The core tags of this character are `bangs, hair_ornament, pink_hair, braid, hat, long_hair, hairclip, red_eyes, very_long_hair, single_braid, pink_eyes, black_headwear, animal_ears, hair_between_eyes, thick_eyebrows`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 25 | 24.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pps_43_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 25 | 13.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pps_43_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 61 | 32.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pps_43_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 25 | 21.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pps_43_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 61 | 44.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pps_43_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/pps_43_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of tag clustering; distinct outfits may be discoverable in these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, long_sleeves, simple_background, solo, striped_shirt, looking_at_viewer, white_background, pleated_skirt, russian_text, blush, jacket, closed_mouth, white_sailor_collar, holding, white_gloves, white_skirt, white_thighhighs, gun, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | simple_background | solo | striped_shirt | looking_at_viewer | white_background | pleated_skirt | russian_text | blush | jacket | closed_mouth | white_sailor_collar | holding | white_gloves | white_skirt | white_thighhighs | gun | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:-------|:----------------|:--------------------|:-------------------|:----------------|:---------------|:--------|:---------|:---------------|:----------------------|:----------|:---------------|:--------------|:-------------------|:------|:-------------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/bernadetta_von_varley_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of bernadetta_von_varley (Fire Emblem)
This is the dataset of bernadetta_von_varley (Fire Emblem), containing 354 images and their tags.
The core tags of this character are `purple_hair, short_hair, grey_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 354 | 356.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bernadetta_von_varley_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 354 | 218.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bernadetta_von_varley_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 708 | 435.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bernadetta_von_varley_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 354 | 322.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bernadetta_von_varley_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 708 | 601.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bernadetta_von_varley_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/bernadetta_von_varley_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of tag clustering; distinct outfits may be discoverable in these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, garreg_mach_monastery_uniform, hood_down, open_mouth, simple_background, upper_body, long_sleeves, solo, wavy_mouth, white_background, ahoge, blush |
| 1 | 12 |  |  |  |  |  | 1girl, garreg_mach_monastery_uniform, holding_stuffed_toy, long_sleeves, simple_background, solo, teddy_bear, hood_down, upper_body, closed_mouth, blush, white_background, open_mouth |
| 2 | 13 |  |  |  |  |  | 1girl, garreg_mach_monastery_uniform, long_sleeves, solo, arrow_(projectile), holding_bow_(weapon), quiver, bike_shorts, boots, simple_background, white_background, open_mouth, closed_mouth, full_body, hood_down |
| 3 | 11 |  |  |  |  |  | 1girl, earrings, long_sleeves, solo, bike_shorts, hair_ornament, short_dress, simple_background, cleavage, open_mouth, white_background, yellow_gloves, quiver, arrow_(projectile), closed_mouth, holding_bow_(weapon), medium_breasts, small_breasts |
| 4 | 6 |  |  |  |  |  | 1girl, earrings, long_sleeves, solo, closed_mouth, dress, simple_background, blush, upper_body, white_background |
| 5 | 11 |  |  |  |  |  | 1girl, rabbit_ears, solo, blush, fake_animal_ears, hair_flower, looking_at_viewer, white_gloves, open_mouth, simple_background, official_alternate_costume, short_sleeves, white_background, dress, tail |
| 6 | 7 |  |  |  |  |  | maid_headdress, 1girl, long_sleeves, maid_apron, simple_background, blush, solo, enmaided, open_mouth |
| 7 | 10 |  |  |  |  |  | 1girl, nipples, hetero, open_mouth, penis, solo_focus, 1boy, blush, purple_eyes, vaginal, sex, small_breasts, cum_in_pussy, spread_legs, bar_censor, garreg_mach_monastery_uniform, long_sleeves, navel, nude, sweat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | garreg_mach_monastery_uniform | hood_down | open_mouth | simple_background | upper_body | long_sleeves | solo | wavy_mouth | white_background | ahoge | blush | holding_stuffed_toy | teddy_bear | closed_mouth | arrow_(projectile) | holding_bow_(weapon) | quiver | bike_shorts | boots | full_body | earrings | hair_ornament | short_dress | cleavage | yellow_gloves | medium_breasts | small_breasts | dress | rabbit_ears | fake_animal_ears | hair_flower | looking_at_viewer | white_gloves | official_alternate_costume | short_sleeves | tail | maid_headdress | maid_apron | enmaided | nipples | hetero | penis | solo_focus | 1boy | purple_eyes | vaginal | sex | cum_in_pussy | spread_legs | bar_censor | navel | nude | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------------------|:------------|:-------------|:--------------------|:-------------|:---------------|:-------|:-------------|:-------------------|:--------|:--------|:----------------------|:-------------|:---------------|:---------------------|:-----------------------|:---------|:--------------|:--------|:------------|:-----------|:----------------|:--------------|:-----------|:----------------|:-----------------|:----------------|:--------|:--------------|:-------------------|:--------------|:--------------------|:---------------|:-----------------------------|:----------------|:-------|:-----------------|:-------------|:-----------|:----------|:---------|:--------|:-------------|:-------|:--------------|:----------|:------|:---------------|:--------------|:-------------|:--------|:-------|:--------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 13 |  |  |  |  |  | X | X | X | X | X | | X | X | | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | | | X | X | | X | X | | X | | | | | X | X | X | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | | | X | X | X | X | | X | | X | | | X | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | | | X | X | | | X | | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | | | X | X | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | |
| 7 | 10 |  |  |  |  |  | X | X | | X | | | X | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
jeffboudier/argilla-news-summary | ---
language:
- en
license:
- cc-by-nc-4.0
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- summarization
task_ids:
- news-articles-summarization
dataset_info:
features:
- name: text
dtype: string
- name: prediction
list:
- name: score
dtype: float64
- name: text
dtype: string
- name: prediction_agent
dtype: string
- name: annotation
dtype: 'null'
- name: annotation_agent
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
- name: vectors
struct:
- name: mini-lm-sentence-transformers
sequence: float64
splits:
- name: train
num_bytes: 5537696
num_examples: 1000
download_size: 4137087
dataset_size: 5537696
duplicated_from: argilla/news-summary
---
# Dataset Card for "news-summary"
## Dataset Description
- **Homepage:** Kaggle Challenge
- **Repository:** https://www.kaggle.com/datasets/clmentbisaillon/fake-and-real-news-dataset?select=True.csv
- **Paper:** N.A.
- **Leaderboard:** N.A.
- **Point of Contact:** N.A.
### Dataset Summary
Can you use this dataset to summarize news articles?
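Each record follows the feature schema above: an article `text` plus a `prediction` list of `{score, text}` candidate summaries. A minimal sketch for picking a record's highest-scoring predicted summary (the sample record is illustrative, not drawn from the dataset):

```python
def best_summary(record):
    """Return the predicted summary text with the highest score, or None."""
    predictions = record.get("prediction") or []
    if not predictions:
        return None
    return max(predictions, key=lambda p: p["score"])["text"]

# Illustrative record shaped like the schema above (not real data).
sample = {
    "text": "WASHINGTON (Reuters) - The head of a conservative Republican faction...",
    "prediction": [
        {"score": 0.31, "text": "Candidate summary A."},
        {"score": 0.62, "text": "Candidate summary B."},
    ],
}
print(best_summary(sample))  # -> Candidate summary B.
```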
### Languages
English
### Citation Information
Acknowledgements:
- Ahmed H., Traore I., Saad S. (2018). "Detecting Opinion Spams and Fake News Using Text Classification." Journal of Security and Privacy, Volume 1, Issue 1, Wiley, January/February 2018.
- Ahmed H., Traore I., Saad S. (2017). "Detection of Online Fake News Using N-Gram Analysis and Machine Learning Techniques." In: Traore I., Woungang I., Awad A. (eds), Intelligent, Secure, and Dependable Systems in Distributed and Cloud Environments (ISDDC 2017), Lecture Notes in Computer Science, vol. 10618, pp. 127–138. Springer, Cham.
### Contributions
Thanks to [@davidberenstein1957](https://github.com/davidberenstein1957) for adding this dataset. |
autoevaluate/autoeval-staging-eval-project-conll2003-e2bfcc2b-10665436 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: huggingface-course/bert-finetuned-ner
metrics: ['jordyvl/ece']
dataset_name: conll2003
dataset_config: conll2003
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: huggingface-course/bert-finetuned-ner
* Dataset: conll2003
* Config: conll2003
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@jordyvl](https://huggingface.co/jordyvl) for evaluating this model. |
Nerfgun3/yor_forger | ---
language:
- en
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/datasets/Nerfgun3/yor_forger/resolve/main/yor_forger_showcase.png"
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
---
# Yor Forger Character Embedding / Textual Inversion
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/yor_forger/resolve/main/yor_forger_showcase.png"/>
## Disclaimer
This is an embedding based on the Anime Character Yor Forger from Spy x Family
## Usage
To use this embedding, download the file and drop it into the `\stable-diffusion-webui\embeddings` folder.
To use it in a prompt: ```"yor_forger"```
Personally, I would recommend using my embeddings at a strength of 0.8, like ```"(yor_forger:0.8)"```, but this embedding works at almost any strength.
I hope you enjoy the embedding. If you have any questions, you can ask me anything via Discord: "Nerfgun3#7508"
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
DopeorNope/new_instruct5 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: tag
dtype: string
splits:
- name: train
num_bytes: 398396529
num_examples: 98033
download_size: 197717356
dataset_size: 398396529
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dell-research-harvard/associating-press | ---
license: cc-by-2.0
---
|
Hmoumad/Prepared-Dataset-Fine-Tune-Llama-2 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 18679350
num_examples: 10000
download_size: 9958147
dataset_size: 18679350
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/random_letter_same_length_find_passage_train400_eval40_title | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 299411
num_examples: 840
- name: validation
num_bytes: 16031
num_examples: 40
download_size: 135104
dataset_size: 315442
---
# Dataset Card for "random_letter_same_length_find_passage_train400_eval40_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/math_dataset_standardized_cluster_4_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 7231825
num_examples: 10010
download_size: 3124973
dataset_size: 7231825
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "math_dataset_standardized_cluster_4_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
genta-tech/squad_pairs_indo | ---
license: cc-by-4.0
task_categories:
- question-answering
language:
- id
size_categories:
- 10K<n<100K
---
Stanford Question Answering Dataset (SQuAD) is a reading comprehension dataset, consisting of questions posed by crowdworkers on a set of Wikipedia articles, where the answer to every question is a segment of text, or span, from the corresponding reading passage, or the question might be unanswerable.
This is an Indonesian-translated version of the [squad](https://huggingface.co/datasets/squad) dataset.
Translated from [sentence-transformers/embedding-training-data](https://huggingface.co/datasets/sentence-transformers/embedding-training-data).
Translated using [Helsinki-NLP/EN-ID](https://huggingface.co/Helsinki-NLP/opus-mt-en-id). |
KaiLv/UDR_ComE | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: label
dtype: string
- name: question
dtype: string
- name: choices
dtype: string
- name: len_question
dtype: int64
- name: max_len_choices
dtype: int64
splits:
- name: train
num_bytes: 4855852
num_examples: 9996
- name: test
num_bytes: 468814
num_examples: 1000
- name: debug
num_bytes: 2432484
num_examples: 5000
download_size: 3748196
dataset_size: 7757150
---
# Dataset Card for "UDR_ComE"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shapermindai/huggingchat | ---
license: apache-2.0
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-100000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1055027
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
librawav/cranedying | ---
license: openrail
---
|
mor40/oscar-tokenized | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: word_ids
sequence: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 12617925936
num_examples: 2326361
download_size: 3746262218
dataset_size: 12617925936
---
# Dataset Card for "oscar-tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/gpteacher-instruct-chatml | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: do_train
dtype: bool
- name: role
dtype: string
splits:
- name: train
num_bytes: 11767161
num_examples: 18194
download_size: 0
dataset_size: 11767161
---
# Dataset Card for "gpteacher-instruct-chatml"
Data preprocessing pipeline: https://github.com/AlekseyKorshuk/chat-data-pipeline |
flydust/SafeDecoding-Attackers | ---
license: mit
extra_gated_prompt: "You agree to not use the dataset to conduct experiments that cause harm to human subjects."
extra_gated_fields:
  Company: text
  Country: country
  Specific date: date_picker
  I want to use this dataset for:
    type: select
    options:
      - Research
      - label: Other
        value: other
  I agree to use this dataset for research use ONLY: checkbox
---
## Dataset Details
This dataset contains attack prompts generated from GCG, AutoDAN, PAIR, and DeepInception for **research use ONLY**.
## Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** [https://github.com/uw-nsl/SafeDecoding](https://github.com/uw-nsl/SafeDecoding)
- **Paper:** [https://arxiv.org/abs/2402.08983](https://arxiv.org/abs/2402.08983)
|
zhangyi617/AE_adversarial_train | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 78132137.0
num_examples: 180
download_size: 78133186
dataset_size: 78132137.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kielzim/cava | ---
license: openrail
---
|
Raziullah/asr_finetune_hi | ---
dataset_info:
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
splits:
- name: train
num_bytes: 195953441.96
num_examples: 6760
- name: test
num_bytes: 100396559.039
num_examples: 2947
download_size: 268823630
dataset_size: 296350000.999
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/carnelian_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of carnelian/カーネリアン/卡涅利安 (Arknights)
This is the dataset of carnelian/カーネリアン/卡涅利安 (Arknights), containing 321 images and their tags.
The core tags of this character are `dark-skinned_female, dark_skin, horns, animal_ears, goat_horns, goat_ears, goat_girl, white_hair, short_hair, red_eyes, breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 321 | 619.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carnelian_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 321 | 509.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carnelian_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 858 | 1.02 GiB | [Download](https://huggingface.co/datasets/CyberHarem/carnelian_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/carnelian_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of tag clustering; distinct outfits may be discoverable in these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, smile, solo, white_jacket, white_shirt, black_gloves, white_background, simple_background, black_skirt, open_jacket, cleavage, medium_breasts, parted_lips, half_gloves, cowboy_shot, miniskirt, upper_body, white_coat |
| 1 | 37 |  |  |  |  |  | 1girl, long_sleeves, solo, white_shirt, black_skirt, holding_sword, looking_at_viewer, black_gloves, white_coat, thigh_strap, white_jacket, medium_breasts, smile, cowboy_shot, miniskirt, cleavage, standing, closed_mouth, infection_monitor_(arknights), open_coat, open_jacket |
| 2 | 8 |  |  |  |  |  | 1girl, black_coat, black_hairband, fur_trim, official_alternate_costume, solo, looking_at_viewer, upper_body, black_jacket, long_sleeves, smile, white_gloves, closed_mouth, grey_hair, sweater, white_background |
| 3 | 12 |  |  |  |  |  | black_jacket, infection_monitor_(arknights), open_jacket, white_dress, yellow_eyes, 1girl, black_collar, long_hair, long_sleeves, black_coat, hair_between_eyes, holding_staff, looking_at_viewer, simple_background, solo, open_coat, upper_body, white_background, white_shirt, standing |
| 4 | 8 |  |  |  |  |  | 1girl, bare_shoulders, black_one-piece_swimsuit, casual_one-piece_swimsuit, looking_at_viewer, official_alternate_costume, solo, necklace, simple_background, smile, closed_mouth, white_background, cleavage, clothing_cutout, covered_navel, thighs, upper_body |
| 5 | 15 |  |  |  |  |  | 1girl, bare_shoulders, black_one-piece_swimsuit, casual_one-piece_swimsuit, looking_at_viewer, necklace, official_alternate_costume, solo, smile, grey_hair, cleavage, sitting, clothing_cutout, closed_mouth, covered_navel, feet_out_of_frame, parted_lips, thigh_strap, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | looking_at_viewer | smile | solo | white_jacket | white_shirt | black_gloves | white_background | simple_background | black_skirt | open_jacket | cleavage | medium_breasts | parted_lips | half_gloves | cowboy_shot | miniskirt | upper_body | white_coat | holding_sword | thigh_strap | standing | closed_mouth | infection_monitor_(arknights) | open_coat | black_coat | black_hairband | fur_trim | official_alternate_costume | black_jacket | white_gloves | grey_hair | sweater | white_dress | yellow_eyes | black_collar | long_hair | hair_between_eyes | holding_staff | bare_shoulders | black_one-piece_swimsuit | casual_one-piece_swimsuit | necklace | clothing_cutout | covered_navel | thighs | sitting | feet_out_of_frame |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:--------|:-------|:---------------|:--------------|:---------------|:-------------------|:--------------------|:--------------|:--------------|:-----------|:-----------------|:--------------|:--------------|:--------------|:------------|:-------------|:-------------|:----------------|:--------------|:-----------|:---------------|:--------------------------------|:------------|:-------------|:-----------------|:-----------|:-----------------------------|:---------------|:---------------|:------------|:----------|:--------------|:--------------|:---------------|:------------|:--------------------|:----------------|:-----------------|:---------------------------|:----------------------------|:-----------|:------------------|:----------------|:---------|:----------|:--------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 37 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | X | X | X | X | | | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | X | | | | X | | | | | | | | | | X | | | | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | X | X | | X | | X | | X | X | | X | | | | | | | X | | | | X | | X | X | X | | | | X | | | | X | X | X | X | X | X | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | X | X | X | | | | X | X | | | X | | | | | | X | | | | | X | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | | |
| 5 | 15 |  |  |  |  |  | X | | X | X | X | | | | | | | | X | | X | | | | | | | X | | X | | | | | | X | | | X | | | | | | | | X | X | X | X | X | X | X | X | X |
|
liuyanchen1015/MULTI_VALUE_stsb_for_to_pupose | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 5493
num_examples: 26
- name: test
num_bytes: 5085
num_examples: 30
- name: train
num_bytes: 18357
num_examples: 105
download_size: 28958
dataset_size: 28935
---
# Dataset Card for "MULTI_VALUE_stsb_for_to_pupose"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/clueweb12_b13_clef-ehealth_fr | ---
pretty_name: '`clueweb12/b13/clef-ehealth/fr`'
viewer: false
source_datasets: ['irds/clueweb12_b13']
task_categories:
- text-retrieval
---
# Dataset Card for `clueweb12/b13/clef-ehealth/fr`
The `clueweb12/b13/clef-ehealth/fr` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/clueweb12#clueweb12/b13/clef-ehealth/fr).
# Data
This dataset provides:
- `queries` (i.e., topics); count=300
- `qrels` (relevance assessments); count=269,232
- For `docs`, use [`irds/clueweb12_b13`](https://huggingface.co/datasets/irds/clueweb12_b13)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/clueweb12_b13_clef-ehealth_fr', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/clueweb12_b13_clef-ehealth_fr', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'trustworthiness': ..., 'understandability': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
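As a hedged sketch of consuming the qrels stream (the records below are invented placeholders in the qrels schema shown above, not real assessments), one common step is to index the assessments by `query_id` so that graded relevance becomes an O(1) lookup:

```python
from collections import defaultdict

# Invented placeholder records in the qrels schema above; in practice
# these would come from iterating the loaded 'qrels' dataset.
qrels_records = [
    {"query_id": "2016-01", "doc_id": "clueweb12-doc-a", "relevance": 1},
    {"query_id": "2016-01", "doc_id": "clueweb12-doc-b", "relevance": 0},
    {"query_id": "2016-02", "doc_id": "clueweb12-doc-a", "relevance": 2},
]

# Index assessments per query: relevance[query_id][doc_id] -> grade.
relevance = defaultdict(dict)
for record in qrels_records:
    relevance[record["query_id"]][record["doc_id"]] = record["relevance"]

# Documents judged relevant (grade > 0) for one query:
relevant_docs = sorted(d for d, g in relevance["2016-01"].items() if g > 0)
```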
## Citation Information
```
@inproceedings{Zuccon2016ClefEhealth,
title={The IR Task at the CLEF eHealth Evaluation Lab 2016: User-centred Health Information Retrieval},
author={Guido Zuccon and Joao Palotti and Lorraine Goeuriot and Liadh Kelly and Mihai Lupu and Pavel Pecina and Henning M{\"u}ller and Julie Budaher and Anthony Deacon},
booktitle={CLEF},
year={2016}
}
@inproceedings{Palotti2017ClefEhealth,
title={CLEF 2017 Task Overview: The IR Task at the eHealth Evaluation Lab - Evaluating Retrieval Methods for Consumer Health Search},
author={Joao Palotti and Guido Zuccon and Jimmy and Pavel Pecina and Mihai Lupu and Lorraine Goeuriot and Liadh Kelly and Allan Hanbury},
booktitle={CLEF},
year={2017}
}
```
|
heliosprime/twitter_dataset_1713015646 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9508
num_examples: 23
download_size: 9132
dataset_size: 9508
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713015646"
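As an illustrative sketch only (the records below are fabricated stand-ins matching the feature schema above, not scraped tweets), a typical first pass over such rows is to rank them by `favourite_count`:

```python
# Fabricated stand-in records matching the schema above; a real run
# would iterate the loaded "train" split instead.
records = [
    {"id": "1", "tweet_content": "first post", "user_name": "alice", "favourite_count": 3},
    {"id": "2", "tweet_content": "hello world", "user_name": "bob", "favourite_count": 10},
    {"id": "3", "tweet_content": "good morning", "user_name": "carol", "favourite_count": 7},
]

# Sort descending by favourite_count to surface the most-liked tweets.
ranked = sorted(records, key=lambda r: r["favourite_count"], reverse=True)
top_tweet = ranked[0]
```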
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_264 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1017407060.0
num_examples: 199805
download_size: 1038357200
dataset_size: 1017407060.0
---
# Dataset Card for "chunk_264"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
g30rv17ys/tjkicksmel | ---
dataset_info:
features:
- name: image
dtype: image
- name: audio_file
dtype: string
- name: slice
dtype: int16
splits:
- name: train
num_bytes: 223991.0
num_examples: 100
download_size: 224647
dataset_size: 223991.0
---
# Dataset Card for "tjkicksmel"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_llama-anon__instruct-13b | ---
pretty_name: Evaluation run of llama-anon/instruct-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llama-anon/instruct-13b](https://huggingface.co/llama-anon/instruct-13b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llama-anon__instruct-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T02:24:06.962063](https://huggingface.co/datasets/open-llm-leaderboard/details_llama-anon__instruct-13b/blob/main/results_2023-09-17T02-24-06.962063.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.31438758389261745,\n\
\ \"em_stderr\": 0.004754574768123327,\n \"f1\": 0.3769809144295322,\n\
\ \"f1_stderr\": 0.004680725874888402,\n \"acc\": 0.37917019961428294,\n\
\ \"acc_stderr\": 0.00825067276736675\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.31438758389261745,\n \"em_stderr\": 0.004754574768123327,\n\
\ \"f1\": 0.3769809144295322,\n \"f1_stderr\": 0.004680725874888402\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.022744503411675512,\n \
\ \"acc_stderr\": 0.004106620637749704\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983799\n\
\ }\n}\n```"
repo_url: https://huggingface.co/llama-anon/instruct-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T02_24_06.962063
path:
- '**/details_harness|drop|3_2023-09-17T02-24-06.962063.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T02-24-06.962063.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T02_24_06.962063
path:
- '**/details_harness|gsm8k|5_2023-09-17T02-24-06.962063.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T02-24-06.962063.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:36.816075.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:48:36.816075.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:48:36.816075.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T02_24_06.962063
path:
- '**/details_harness|winogrande|5_2023-09-17T02-24-06.962063.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T02-24-06.962063.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_48_36.816075
path:
- results_2023-07-19T18:48:36.816075.parquet
- split: 2023_09_17T02_24_06.962063
path:
- results_2023-09-17T02-24-06.962063.parquet
- split: latest
path:
- results_2023-09-17T02-24-06.962063.parquet
---
# Dataset Card for Evaluation run of llama-anon/instruct-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/llama-anon/instruct-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [llama-anon/instruct-13b](https://huggingface.co/llama-anon/instruct-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llama-anon__instruct-13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T02:24:06.962063](https://huggingface.co/datasets/open-llm-leaderboard/details_llama-anon__instruct-13b/blob/main/results_2023-09-17T02-24-06.962063.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.31438758389261745,
"em_stderr": 0.004754574768123327,
"f1": 0.3769809144295322,
"f1_stderr": 0.004680725874888402,
"acc": 0.37917019961428294,
"acc_stderr": 0.00825067276736675
},
"harness|drop|3": {
"em": 0.31438758389261745,
"em_stderr": 0.004754574768123327,
"f1": 0.3769809144295322,
"f1_stderr": 0.004680725874888402
},
"harness|gsm8k|5": {
"acc": 0.022744503411675512,
"acc_stderr": 0.004106620637749704
},
"harness|winogrande|5": {
"acc": 0.7355958958168903,
"acc_stderr": 0.012394724896983799
}
}
```
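Each metric above is reported alongside its standard error over examples. As a rough sketch of that aggregation (assuming per-example 0/1 scores; the harness's exact computation may differ), the mean and standard error can be computed as:

```python
import math

def mean_and_stderr(scores):
    """Mean and standard error of per-example scores (e.g. 0/1 accuracies)."""
    n = len(scores)
    mean = sum(scores) / n
    # Sample variance with Bessel's correction, then stderr = s / sqrt(n)
    var = sum((s - mean) ** 2 for s in scores) / (n - 1)
    return mean, math.sqrt(var / n)

acc, acc_err = mean_and_stderr([1, 0, 1, 1, 0, 1, 1, 0, 1, 1])
print(round(acc, 2), round(acc_err, 3))  # → 0.7 0.153
```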
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Umbaji/umni_speech | ---
license: mit
---
|
mightyjoe87/myfitcoachdata1 | ---
license: apache-2.0
---
|
Bsbell21/genaidata3 | ---
dataset_info:
features:
- name: item
dtype: string
- name: description
dtype: string
- name: ad
dtype: string
splits:
- name: train
num_bytes: 890
num_examples: 5
download_size: 3305
dataset_size: 890
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "genaidata3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alexoamber/testing | ---
license: afl-3.0
---
|
adriana98/medical_spanish | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 47271135.0
num_examples: 168
- name: test
num_bytes: 9986915.0
num_examples: 43
download_size: 57137009
dataset_size: 57258050.0
---
|
EPFL-LNMC/channelpedia_paper_tagged | ---
dataset_info:
features:
- name: metadata
struct:
- name: section
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: tags
sequence: string
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 126163106
num_examples: 30926
download_size: 127902726
dataset_size: 126163106
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "channelpedia_paper_tagged"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
income/cqadupstack-tex-top-20-gen-queries | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100k<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
---
# CQADupstack-Tex: top-20 generated queries (BEIR Benchmark)
This HF dataset contains the top-20 synthetic queries generated for each passage in the above BEIR benchmark dataset.
- DocT5query model used: [BeIR/query-gen-msmarco-t5-base-v1](https://huggingface.co/BeIR/query-gen-msmarco-t5-base-v1)
- id (str): unique document id in CQADupstack-Tex in the BEIR benchmark (`corpus.jsonl`).
- Questions generated: 20
- Code used for generation: [evaluate_anserini_docT5query_parallel.py](https://github.com/beir-cellar/beir/blob/main/examples/retrieval/evaluation/sparse/evaluate_anserini_docT5query_parallel.py)
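Generated queries like these are typically used for document expansion in the doc2query/docT5query style: each passage is indexed together with its synthetic queries so a sparse retriever such as BM25 can match query vocabulary. A minimal sketch of that expansion step (the function and field names here are illustrative, not this dataset's actual schema):

```python
def expand_passage(passage_text, generated_queries, times=1):
    """Append generated queries to the passage text before indexing,
    as in the doc2query document-expansion approach."""
    expansion = " ".join(generated_queries)
    return passage_text + (" " + expansion) * times

doc = "Wheat beer is a top-fermented beer brewed with a large proportion of wheat."
queries = ["what is wheat beer", "how is wheat beer brewed"]
expanded = expand_passage(doc, queries)
print(expanded)
```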
Below is the original dataset card for the BEIR benchmark.
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
### Supported Tasks and Leaderboards
The dataset supports a leaderboard that evaluates models against task-specific metrics such as F1 or EM, as well as their ability to retrieve supporting information from Wikipedia.
The current best performing models can be found [here](https://eval.ai/web/challenges/challenge-page/689/leaderboard/).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score`, in this order. Keep the first row as a header. For example: `q1 doc1 1`
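A minimal stdlib reader for these three files might look like the following sketch (it parses in-memory samples here; in practice you would read `corpus.jsonl` and `qrels/test.tsv` from disk):

```python
import csv
import io
import json

def load_corpus(jsonl_lines):
    """Parse corpus.jsonl lines into {doc_id: {"title": ..., "text": ...}}."""
    corpus = {}
    for line in jsonl_lines:
        doc = json.loads(line)
        corpus[doc["_id"]] = {"title": doc.get("title", ""), "text": doc["text"]}
    return corpus

def load_qrels(tsv_text):
    """Parse a qrels .tsv (header row: query-id, corpus-id, score)."""
    qrels = {}
    reader = csv.reader(io.StringIO(tsv_text), delimiter="\t")
    next(reader)  # skip the header row
    for qid, did, score in reader:
        qrels.setdefault(qid, {})[did] = int(score)
    return qrels

corpus = load_corpus(['{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was..."}'])
qrels = load_qrels("query-id\tcorpus-id\tscore\nq1\tdoc1\t1\n")
print(corpus["doc1"]["title"], qrels["q1"]["doc1"])  # → Albert Einstein 1
```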
### Data Instances
A high-level example of a BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist. who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
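Given qrels and ranked retrieval results in the dictionary shapes shown above, a retrieval metric such as Recall@k can be computed directly. A small sketch (BEIR's official evaluation reports nDCG@10 and related metrics via `pytrec_eval`; this is only an illustration of the data layout):

```python
def recall_at_k(qrels, results, k):
    """Mean fraction of relevant documents found in the top-k results per query."""
    recalls = []
    for qid, relevant in qrels.items():
        top_k = results.get(qid, [])[:k]
        hits = sum(1 for doc_id in top_k if relevant.get(doc_id, 0) > 0)
        recalls.append(hits / len(relevant))
    return sum(recalls) / len(recalls)

qrels = {"q1": {"doc1": 1}, "q2": {"doc2": 1}}
results = {"q1": ["doc1", "doc3"], "q2": ["doc4", "doc2"]}
print(recall_at_k(qrels, results, 2))  # → 1.0
```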
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query document relevance judgements, made up of:
- `_id`: a `string` feature representing the query id
- `_id`: a `string` feature, denoting the document id.
- `score`: a `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset.
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](tp://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
```python
```
### Supported Tasks and Leaderboards
The dataset supports a leaderboard that evaluates models against task-specific metrics such as F1 or EM, as well as their ability to retrieve supporting information from Wikipedia.
The current best performing models can be found [here](https://eval.ai/web/challenges/challenge-page/689/leaderboard/).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-seperated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score` in this order. Keep 1st row as header. For example: `q1 doc1 1`
### Data Instances
A high level example of any beir dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist. who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query document relevance judgements, made up of:
- `_id`: a `string` feature representing the query id
- `_id`: a `string` feature, denoting the document id.
	- `score`: an `int32` feature, denoting the relevance judgement between query and document.
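Tying the three feature sets together, a small consistency check (a hypothetical helper, not part of the dataset tooling) can verify that every relevance judgement points at a known query and document:

```python
def check_qrels(corpus, queries, qrels):
    """Raise if any relevance judgement refers to an unknown query or document."""
    for query_id, judged in qrels.items():
        if query_id not in queries:
            raise KeyError(f"qrels references unknown query id {query_id!r}")
        for doc_id, score in judged.items():
            if doc_id not in corpus:
                raise KeyError(f"qrels references unknown document id {doc_id!r}")
            if not isinstance(score, int):
                raise TypeError(f"score for {query_id!r}/{doc_id!r} must be an int")

corpus = {"doc1": {"title": "Albert Einstein", "text": "..."},
          "doc2": {"title": "", "text": "..."}}
queries = {"q1": "Who developed the mass-energy equivalence formula?",
           "q2": "Which beer is brewed with a large proportion of wheat?"}
qrels = {"q1": {"doc1": 1}, "q2": {"doc2": 1}}
check_qrels(corpus, queries, qrels)  # passes silently
```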
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
matijao/webdev | ---
license: mit
---
|
GlowingBrick/venv-me | ---
license: mit
---
|
Kotokin/artisticBIG | ---
license: unknown
---
|
irds/clueweb09 | ---
pretty_name: '`clueweb09`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `clueweb09`
The `clueweb09` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/clueweb09#clueweb09).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=1,040,859,705
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/clueweb09', 'docs')
for record in docs:
record # {'doc_id': ..., 'url': ..., 'date': ..., 'http_headers': ..., 'body': ..., 'body_content_type': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
mstz/heloc | ---
language:
- en
tags:
- heloc
- fico
- tabular_classification
- binary_classification
pretty_name: Heloc
size_categories:
- 10K<n<100K
task_categories:
- tabular-classification
configs:
- risk
license: cc
---
# HELOC
The [HELOC dataset](https://community.fico.com/s/explainable-machine-learning-challenge?tabset-158d9=d157e) from FICO.
Each entry in the dataset is a line of credit, typically offered by a bank as a percentage of home equity (the difference between the current market value of a home and the outstanding balance on its mortgage).
The customers in this dataset have requested a credit line in the range of $5,000 - $150,000.
The fundamental task is to use the information about the applicant in their credit report to predict whether they will repay their HELOC account within 2 years.
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-------------------|---------------------------|-----------------------------------------------------------------|
| risk | Binary classification | Will the customer default? |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/heloc")["train"]
```
# Features
|**Feature** |**Type**|
|-------------------------------------------|--------|
|`estimate_of_risk` |`int8` |
|`months_since_first_trade` |`int32` |
|`months_since_last_trade` |`int32` |
|`average_duration_of_resolution` |`int32` |
|`number_of_satisfactory_trades` |`int16` |
|`nr_trades_insolvent_for_over_60_days` |`int16` |
|`nr_trades_insolvent_for_over_90_days` |`int16` |
|`percentage_of_legal_trades` |`int16` |
|`months_since_last_illegal_trade` |`int32` |
|`maximum_illegal_trades_over_last_year` |`int8` |
|`maximum_illegal_trades` |`int16` |
|`nr_total_trades` |`int16` |
|`nr_trades_initiated_in_last_year` |`int16` |
|`percentage_of_installment_trades` |`int16` |
|`months_since_last_inquiry_not_recent` |`int16` |
|`nr_inquiries_in_last_6_months` |`int16` |
|`nr_inquiries_in_last_6_months_not_recent` |`int16` |
|`net_fraction_of_revolving_burden` |`int32` |
|`net_fraction_of_installment_burden` |`int32` |
|`nr_revolving_trades_with_balance` |`int16` |
|`nr_installment_trades_with_balance` |`int16` |
|`nr_banks_with_high_ratio` |`int16` |
|`percentage_trades_with_balance` |`int16` | |
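As a sketch of the task framing (binary default prediction from the tabular features above), here is a dependency-free decision-stump baseline. The rows below are randomly generated stand-ins using two of the listed feature names, and the binary label is simulated purely for illustration; the real rows come from `load_dataset("mstz/heloc")["train"]`:

```python
import random

random.seed(0)

# Synthetic stand-ins for two of the features listed above; the label
# (1 = default) is simulated, not taken from the actual dataset.
rows = []
for _ in range(1000):
    risk = random.randint(0, 100)      # estimate_of_risk
    burden = random.randint(0, 200)    # net_fraction_of_revolving_burden
    label = 1 if risk + random.gauss(0, 10) > 50 else 0
    rows.append({"estimate_of_risk": risk,
                 "net_fraction_of_revolving_burden": burden,
                 "label": label})

def stump_accuracy(rows, feature, threshold):
    """Accuracy of the rule: predict default iff rows[feature] > threshold."""
    correct = sum((r[feature] > threshold) == bool(r["label"]) for r in rows)
    return correct / len(rows)

# Pick the best threshold for one feature on a train split, evaluate on the rest.
train_rows, test_rows = rows[:800], rows[800:]
best_t = max(range(0, 101, 5),
             key=lambda t: stump_accuracy(train_rows, "estimate_of_risk", t))
print(stump_accuracy(test_rows, "estimate_of_risk", best_t))
```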
open-llm-leaderboard/details_Taekyoon__llama2-ko-7b-test | ---
pretty_name: Evaluation run of Taekyoon/llama2-ko-7b-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Taekyoon/llama2-ko-7b-test](https://huggingface.co/Taekyoon/llama2-ko-7b-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Taekyoon__llama2-ko-7b-test\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-28T09:09:02.494936](https://huggingface.co/datasets/open-llm-leaderboard/details_Taekyoon__llama2-ko-7b-test/blob/main/results_2023-08-28T09%3A09%3A02.494936.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.29939516770443486,\n \"\
acc_stderr\": 0.03301601100044412,\n \"acc_norm\": 0.3025804071569399,\n\
\ \"acc_norm_stderr\": 0.03301705555289615,\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023505,\n \"mc2\": 0.35997350772493825,\n\
\ \"mc2_stderr\": 0.014042588070598837\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3506825938566553,\n \"acc_stderr\": 0.013944635930726089,\n\
\ \"acc_norm\": 0.3779863481228669,\n \"acc_norm_stderr\": 0.014169664520303105\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4697271459868552,\n\
\ \"acc_stderr\": 0.004980627287147587,\n \"acc_norm\": 0.6303525194184425,\n\
\ \"acc_norm_stderr\": 0.004817227292240292\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174022,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174022\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3320754716981132,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.3320754716981132,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n\
\ \"acc_stderr\": 0.034140140070440354,\n \"acc_norm\": 0.2774566473988439,\n\
\ \"acc_norm_stderr\": 0.034140140070440354\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237657,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237657\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918424,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918424\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.032006864972873916,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.032006864972873916\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23225806451612904,\n\
\ \"acc_stderr\": 0.02402225613030824,\n \"acc_norm\": 0.23225806451612904,\n\
\ \"acc_norm_stderr\": 0.02402225613030824\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114485,\n\
\ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114485\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.37373737373737376,\n \"acc_stderr\": 0.034468977386593325,\n \"\
acc_norm\": 0.37373737373737376,\n \"acc_norm_stderr\": 0.034468977386593325\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.03435696168361355,\n\
\ \"acc_norm\": 0.3471502590673575,\n \"acc_norm_stderr\": 0.03435696168361355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.32051282051282054,\n \"acc_stderr\": 0.023661296393964273,\n\
\ \"acc_norm\": 0.32051282051282054,\n \"acc_norm_stderr\": 0.023661296393964273\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.029597329730978086,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.029597329730978086\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598096,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598096\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3357798165137615,\n \"acc_stderr\": 0.020248081396752937,\n \"\
acc_norm\": 0.3357798165137615,\n \"acc_norm_stderr\": 0.020248081396752937\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501954,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501954\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.34080717488789236,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3053435114503817,\n \"acc_stderr\": 0.040393149787245626,\n\
\ \"acc_norm\": 0.3053435114503817,\n \"acc_norm_stderr\": 0.040393149787245626\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n\
\ \"acc_stderr\": 0.038342410214190735,\n \"acc_norm\": 0.20535714285714285,\n\
\ \"acc_norm_stderr\": 0.038342410214190735\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.04777615181156739,\n\
\ \"acc_norm\": 0.36893203883495146,\n \"acc_norm_stderr\": 0.04777615181156739\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.32051282051282054,\n\
\ \"acc_stderr\": 0.03057281131029961,\n \"acc_norm\": 0.32051282051282054,\n\
\ \"acc_norm_stderr\": 0.03057281131029961\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29246487867177523,\n\
\ \"acc_stderr\": 0.016267000684598645,\n \"acc_norm\": 0.29246487867177523,\n\
\ \"acc_norm_stderr\": 0.016267000684598645\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808836,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808836\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3408360128617363,\n\
\ \"acc_stderr\": 0.02692084126077616,\n \"acc_norm\": 0.3408360128617363,\n\
\ \"acc_norm_stderr\": 0.02692084126077616\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.30246913580246915,\n \"acc_stderr\": 0.025557653981868055,\n\
\ \"acc_norm\": 0.30246913580246915,\n \"acc_norm_stderr\": 0.025557653981868055\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290396,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.288135593220339,\n\
\ \"acc_stderr\": 0.011567140661324563,\n \"acc_norm\": 0.288135593220339,\n\
\ \"acc_norm_stderr\": 0.011567140661324563\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.35454545454545455,\n \"acc_stderr\": 0.04582004841505416,\n\
\ \"acc_norm\": 0.35454545454545455,\n \"acc_norm_stderr\": 0.04582004841505416\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.40408163265306124,\n\
\ \"acc_stderr\": 0.031414708025865885,\n \"acc_norm\": 0.40408163265306124,\n\
\ \"acc_norm_stderr\": 0.031414708025865885\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916714,\n\
\ \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916714\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.30120481927710846,\n \"acc_stderr\": 0.0357160923005348,\n\
\ \"acc_norm\": 0.30120481927710846,\n \"acc_norm_stderr\": 0.0357160923005348\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.03565079670708312,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.03565079670708312\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023505,\n\
\ \"mc2\": 0.35997350772493825,\n \"mc2_stderr\": 0.014042588070598837\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Taekyoon/llama2-ko-7b-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|arc:challenge|25_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hellaswag|10_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T09:09:02.494936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T09:09:02.494936.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T09:09:02.494936.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T09:09:02.494936.parquet'
- config_name: results
data_files:
- split: 2023_08_28T09_09_02.494936
path:
- results_2023-08-28T09:09:02.494936.parquet
- split: latest
path:
- results_2023-08-28T09:09:02.494936.parquet
---
# Dataset Card for Evaluation run of Taekyoon/llama2-ko-7b-test
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Taekyoon/llama2-ko-7b-test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Taekyoon/llama2-ko-7b-test](https://huggingface.co/Taekyoon/llama2-ko-7b-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Taekyoon__llama2-ko-7b-test",
	"harness_truthfulqa_mc_0",
	split="latest")
```
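Once loaded, each per-task entry in the results dictionary can be aggregated locally. Below is a minimal sketch that averages the `acc` metric over the MMLU (`hendrycksTest`) tasks; the `results` dict here is a small hypothetical excerpt of the "Latest results" JSON, not the full evaluation output.

```python
# Hypothetical excerpt of the per-task results; the real dict holds 57 MMLU tasks.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.23},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2518518518518518},
}

# Select only the MMLU tasks by their "hendrycksTest" prefix.
mmlu_tasks = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest")}

# Unweighted mean accuracy across the selected tasks.
mean_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(round(mean_acc, 4))  # → 0.2409
```

The same pattern applies to `acc_norm` or to other task families (e.g. `harness|arc:challenge|25`) by changing the prefix filter.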
## Latest results
These are the [latest results from run 2023-08-28T09:09:02.494936](https://huggingface.co/datasets/open-llm-leaderboard/details_Taekyoon__llama2-ko-7b-test/blob/main/results_2023-08-28T09%3A09%3A02.494936.json):
```json
{
"all": {
"acc": 0.29939516770443486,
"acc_stderr": 0.03301601100044412,
"acc_norm": 0.3025804071569399,
"acc_norm_stderr": 0.03301705555289615,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023505,
"mc2": 0.35997350772493825,
"mc2_stderr": 0.014042588070598837
},
"harness|arc:challenge|25": {
"acc": 0.3506825938566553,
"acc_stderr": 0.013944635930726089,
"acc_norm": 0.3779863481228669,
"acc_norm_stderr": 0.014169664520303105
},
"harness|hellaswag|10": {
"acc": 0.4697271459868552,
"acc_stderr": 0.004980627287147587,
"acc_norm": 0.6303525194184425,
"acc_norm_stderr": 0.004817227292240292
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174022,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174022
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3320754716981132,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.3320754716981132,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.034140140070440354,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.034140140070440354
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237657,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237657
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918424,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.032006864972873916,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.032006864972873916
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23225806451612904,
"acc_stderr": 0.02402225613030824,
"acc_norm": 0.23225806451612904,
"acc_norm_stderr": 0.02402225613030824
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114485,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.37373737373737376,
"acc_stderr": 0.034468977386593325,
"acc_norm": 0.37373737373737376,
"acc_norm_stderr": 0.034468977386593325
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3471502590673575,
"acc_stderr": 0.03435696168361355,
"acc_norm": 0.3471502590673575,
"acc_norm_stderr": 0.03435696168361355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.023661296393964273,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.023661296393964273
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073835,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.029597329730978086,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.029597329730978086
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598096,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598096
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3357798165137615,
"acc_stderr": 0.020248081396752937,
"acc_norm": 0.3357798165137615,
"acc_norm_stderr": 0.020248081396752937
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501954,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501954
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3053435114503817,
"acc_stderr": 0.040393149787245626,
"acc_norm": 0.3053435114503817,
"acc_norm_stderr": 0.040393149787245626
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.038342410214190735,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.038342410214190735
},
"harness|hendrycksTest-management|5": {
"acc": 0.36893203883495146,
"acc_stderr": 0.04777615181156739,
"acc_norm": 0.36893203883495146,
"acc_norm_stderr": 0.04777615181156739
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.03057281131029961,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.03057281131029961
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29246487867177523,
"acc_stderr": 0.016267000684598645,
"acc_norm": 0.29246487867177523,
"acc_norm_stderr": 0.016267000684598645
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808836,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808836
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3408360128617363,
"acc_stderr": 0.02692084126077616,
"acc_norm": 0.3408360128617363,
"acc_norm_stderr": 0.02692084126077616
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.30246913580246915,
"acc_stderr": 0.025557653981868055,
"acc_norm": 0.30246913580246915,
"acc_norm_stderr": 0.025557653981868055
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290396,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.288135593220339,
"acc_stderr": 0.011567140661324563,
"acc_norm": 0.288135593220339,
"acc_norm_stderr": 0.011567140661324563
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.35454545454545455,
"acc_stderr": 0.04582004841505416,
"acc_norm": 0.35454545454545455,
"acc_norm_stderr": 0.04582004841505416
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40408163265306124,
"acc_stderr": 0.031414708025865885,
"acc_norm": 0.40408163265306124,
"acc_norm_stderr": 0.031414708025865885
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.0357160923005348,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.0357160923005348
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.03565079670708312,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.03565079670708312
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023505,
"mc2": 0.35997350772493825,
"mc2_stderr": 0.014042588070598837
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_stsb_a_participle | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 82768
num_examples: 410
- name: test
num_bytes: 51866
num_examples: 271
- name: train
num_bytes: 273780
num_examples: 1421
download_size: 269468
dataset_size: 408414
---
# Dataset Card for "MULTI_VALUE_stsb_a_participle"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KrisPi/PythonTutor-Evol-1k-DPO-GPT4_vs_35 | ---
license: cc-by-nc-sa-4.0
language:
- en
size_categories:
- n<1K
---
Started with:
https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1
(GPT-3.5 Turbo)
Randomly selected 1,000 examples whose output contained a "```python" code block.
Generated GPT-4 answers to those examples, both for LIMA-style "Python Tutor" instruct fine-tuning and to validate DPO fine-tuning (where the GPT-4 answer is preferred over the GPT-3.5 Turbo one).
Then filtered out refusals (answers containing "impossible" or "sorry").
GPT-4 System Prompt:
You are an intelligent assistant that generates Python code. Start generation with ```python and end with ``` and nothing else. Just content between ```python and ```. The generated code should be wrapped in triple backticks and language identifier. Each line of code should be accompanied by a comment explaining it, and every function definition should be followed by a docstring describing the function, solution approach, and any edge cases considered. Try to wrap code in a function. |
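The selection and filtering steps described above can be sketched as follows. This is a minimal illustration on hand-made toy records, not the actual build script; the keyword filters follow the description above, and the sample records are hypothetical:

```python
import random

# Toy stand-ins for Evol-Instruct-Code-80k-v1 records (hypothetical examples).
records = [
    {"instruction": "Reverse a string.", "output": "```python\ns[::-1]\n```"},
    {"instruction": "Sort a list.", "output": "Use sorted(xs)."},
    {"instruction": "Do X.", "output": "```python\n# Sorry, this is impossible in pure Python\n```"},
]

# Step 1: keep only answers that contain a Python code block.
with_python = [r for r in records if "```python" in r["output"]]

# Step 2: randomly sample up to 1,000 of the survivors.
random.seed(0)
sample = random.sample(with_python, k=min(1000, len(with_python)))

# Step 3: drop refusals (the card filters on "impossible" or "sorry").
def is_refusal(text):
    lowered = text.lower()
    return "impossible" in lowered or "sorry" in lowered

kept = [r for r in sample if not is_refusal(r["output"])]
```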
kuanhuggingface/hint-lm-data | ---
dataset_info:
features:
- name: question
dtype: string
- name: options
sequence: string
- name: answer
dtype: string
- name: prompt
struct:
- name: Analyze the given information, break down the problem into manageable
steps, apply suitable mathematical operations, and provide a clear, accurate,
and concise solution, ensuring precise rounding if necessary. Consider all
variables and carefully consider the problem’s context for an efficient solution.
dtype: string
- name: Answer Directly.
dtype: string
- name: Break this down.
dtype: string
- name: Embrace challenges as opportunities for growth. Each obstacle you overcome
brings you closer to success.
dtype: string
- name: Let’s be realistic and think step by step.
dtype: string
- name: Let’s solve this problem by splitting it into steps.
dtype: string
- name: Let’s think about this logically.
dtype: string
- name: Let’s think like a detective step by step.
dtype: string
- name: Let’s think step by step.
dtype: string
- name: Let’s work this out in a step by step way to be sure we have the right
answer.
dtype: string
- name: 'Let’s work through this problem step-by-step:'
dtype: string
- name: Question decomposition.
dtype: string
- name: Remember that progress is made one step at a time. Stay determined and
keep moving forward.
dtype: string
- name: Stay focused and dedicated to your goals. Your consistent efforts will
lead to outstanding achievements.
dtype: string
- name: Take a deep breath and work on this problem step-by-step.
dtype: string
- name: Take a deep breath and work on this problem.
dtype: string
- name: Take pride in your work and give it your best. Your commitment to excellence
sets you apart.
dtype: string
- name: This is very important to my career.
dtype: string
- name: Write your answer and give me a confidence score between 0-1 for your
answer.
dtype: string
- name: You have to solve this problem, I am in trouble.
dtype: string
- name: You'd better be sure.
dtype: string
splits:
- name: hotpotqa_train
num_bytes: 94526339
num_examples: 5481
- name: hotpotqa_validation
num_bytes: 7987679
num_examples: 458
- name: openbookqa_train
num_bytes: 132616921
num_examples: 4957
- name: openbookqa_validation
num_bytes: 13925080
num_examples: 500
- name: openbookqa_test
num_bytes: 14024852
num_examples: 500
- name: strategyqa_train
num_bytes: 51961161
num_examples: 1790
- name: strategyqa_full
num_bytes: 66362783
num_examples: 2290
- name: strategyqa_test
num_bytes: 14398008
num_examples: 500
- name: truthfulqa_train
num_bytes: 9563847
num_examples: 317
- name: truthfulqa_full
num_bytes: 24894176
num_examples: 817
- name: truthfulqa_test
num_bytes: 15328531
num_examples: 500
download_size: 211054466
dataset_size: 445589377
configs:
- config_name: default
data_files:
- split: hotpotqa_train
path: data/hotpotqa_train-*
- split: hotpotqa_validation
path: data/hotpotqa_validation-*
- split: openbookqa_train
path: data/openbookqa_train-*
- split: openbookqa_validation
path: data/openbookqa_validation-*
- split: openbookqa_test
path: data/openbookqa_test-*
- split: strategyqa_train
path: data/strategyqa_train-*
- split: strategyqa_full
path: data/strategyqa_full-*
- split: strategyqa_test
path: data/strategyqa_test-*
- split: truthfulqa_train
path: data/truthfulqa_train-*
- split: truthfulqa_full
path: data/truthfulqa_full-*
- split: truthfulqa_test
path: data/truthfulqa_test-*
---
|
kaydee/wildreceipt | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: string
- name: words
sequence: string
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence:
class_label:
names:
'0': Ignore
'1': Store_name_value
'2': Store_name_key
'3': Store_addr_value
'4': Store_addr_key
'5': Tel_value
'6': Tel_key
'7': Date_value
'8': Date_key
'9': Time_value
'10': Time_key
'11': Prod_item_value
'12': Prod_item_key
'13': Prod_quantity_value
'14': Prod_quantity_key
'15': Prod_price_value
'16': Prod_price_key
'17': Subtotal_value
'18': Subtotal_key
'19': Tax_value
'20': Tax_key
'21': Tips_value
'22': Tips_key
'23': Total_value
'24': Total_key
'25': Others
splits:
- name: train
num_bytes: 897479122.514
num_examples: 1267
- name: test
num_bytes: 428203797.0
num_examples: 472
download_size: 1367650317
dataset_size: 1325682919.514
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
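The `ner_tags` feature stores class indices; they can be decoded with the id-to-name mapping from the `class_label` block above. A minimal sketch (only a subset of the mapping is reproduced here; with the `datasets` library loaded, the equivalent lookup should be `ds.features["ner_tags"].feature.int2str(i)`):

```python
# Subset of the id -> label mapping defined in the YAML `class_label` block.
ID2LABEL = {
    0: "Ignore",
    1: "Store_name_value",
    3: "Store_addr_value",
    7: "Date_value",
    23: "Total_value",
    25: "Others",
}

def decode(tag_ids):
    """Map a sequence of `ner_tags` class indices to their label names."""
    return [ID2LABEL[i] for i in tag_ids]
```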
|
marmofayezi/CelebAPair | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: image_hq
dtype: image
- name: mask
dtype: image
- name: captions
sequence: string
splits:
- name: train
num_bytes: 3540059470.3
num_examples: 28495
- name: test
num_bytes: 213012020.892
num_examples: 1498
download_size: 3736488775
dataset_size: 3753071491.1920004
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
argilla/mistral-vs-llama-rating | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for mistral-vs-llama-rating
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla with `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("argilla/mistral-vs-llama-rating")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` with `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("argilla/mistral-vs-llama-rating")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/guides/llms/conceptual_guides/data_model.html) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| input | Input | TextField | True | False |
| response_1 | Response_1 | TextField | True | True |
| response_2 | Response_2 | TextField | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, single choice, or multiple choice.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| rating_1 | Rate the response_1? | RatingQuestion | True | N/A | [1, 2, 3, 4, 5] |
| rationale_1 | Rationale behind response_1's ranking? | TextQuestion | True | N/A | N/A |
| rating_2 | Rate the response_2? | RatingQuestion | True | N/A | [1, 2, 3, 4, 5] |
| rationale_2 | Rationale behind response_2's ranking? | TextQuestion | True | N/A | N/A |
**✨ NEW** Additionally, we also have **suggestions**, which are linked to the existing questions and are named by appending "-suggestion" and "-suggestion-metadata" to the question name, containing the value(s) of the suggestion and its metadata, respectively. The possible values are the same as in the table above.
Finally, the **guidelines** are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"fields": {
"input": "How do I locate my card?",
"response_1": " Thank you for reaching out! I\u0027m here to help you with your query. However, I must point out that the term \"locate\" can have different meanings depending on the context. Could you please provide more details or clarify what you mean by \"locate\"? Are you trying to find a specific card, such as a credit card or debit card? Or perhaps you\u0027re looking for something else entirely? Once I have a better understanding of your question, I\u0027ll be happy to assist you in a responsible and accurate manner.",
"response_2": " Of course! I\u0027d be happy to help you locate your missing card. To start, could you please tell me more about the card you\u0027re trying to find? Is it a credit card, debit card, or another type of card? Additionally, can you provide more details about the circumstances surrounding its disappearance? For example, did you lose it at home, in a store, or somewhere else? Any additional information you can provide will help me give you more tailored advice on how to locate your card."
},
"metadata": {},
"responses": [
{
"status": "submitted",
"user_id": "39bdc103-0e37-4515-a353-abe3b68bdebd",
"values": {
"rating_1": {
"value": 3
},
"rating_2": {
"value": 4
},
"rationale_1": {
"value": "The text is correct and provides a helpful response to the query. However, it lacks specific instructions or steps to help the user locate their card, hence it\u0027s not highly informative."
},
"rationale_2": {
"value": "The text is accurate and provides a more detailed response to the query. It asks for more specific information to provide a tailored solution, making it highly informative. However, it still lacks the profound insights or comprehensive information that would make it outstandingly helpful."
}
}
}
],
"suggestions": [
{
"agent": "gpt-4",
"question_id": "0a3cf946-2bd3-4d57-b42c-caddf8cc4979",
"question_name": "rating_1",
"value": 3
},
{
"agent": "gpt-4",
"question_id": "d7d8347b-554a-4eaf-ad2b-64898f3e0841",
"question_name": "rationale_1",
"value": "The text is correct and provides a helpful response to the query. However, it lacks specific instructions or steps to help the user locate their card, hence it\u0027s not highly informative."
},
{
"agent": "gpt-4",
"question_id": "99672a09-0d1c-45d7-ac2c-fe692603c5b4",
"question_name": "rating_2",
"value": 4
},
{
"agent": "gpt-4",
"question_id": "8276bff4-1106-4fef-abcb-ebc70beb791e",
"question_name": "rationale_2",
"value": "The text is accurate and provides a more detailed response to the query. It asks for more specific information to provide a tailored solution, making it highly informative. However, it still lacks the profound insights or comprehensive information that would make it outstandingly helpful."
}
]
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"input": "How do I locate my card?",
"metadata": "{}",
"rating_1": [
{
"status": "submitted",
"user_id": "39bdc103-0e37-4515-a353-abe3b68bdebd",
"value": 3
}
],
"rating_1-suggestion": 3,
"rating_1-suggestion-metadata": {
"agent": "gpt-4",
"score": null,
"type": null
},
"rating_2": [
{
"status": "submitted",
"user_id": "39bdc103-0e37-4515-a353-abe3b68bdebd",
"value": 4
}
],
"rating_2-suggestion": 4,
"rating_2-suggestion-metadata": {
"agent": "gpt-4",
"score": null,
"type": null
},
"rationale_1": [
{
"status": "submitted",
"user_id": "39bdc103-0e37-4515-a353-abe3b68bdebd",
"value": "The text is correct and provides a helpful response to the query. However, it lacks specific instructions or steps to help the user locate their card, hence it\u0027s not highly informative."
}
],
"rationale_1-suggestion": "The text is correct and provides a helpful response to the query. However, it lacks specific instructions or steps to help the user locate their card, hence it\u0027s not highly informative.",
"rationale_1-suggestion-metadata": {
"agent": "gpt-4",
"score": null,
"type": null
},
"rationale_2": [
{
"status": "submitted",
"user_id": "39bdc103-0e37-4515-a353-abe3b68bdebd",
"value": "The text is accurate and provides a more detailed response to the query. It asks for more specific information to provide a tailored solution, making it highly informative. However, it still lacks the profound insights or comprehensive information that would make it outstandingly helpful."
}
],
"rationale_2-suggestion": "The text is accurate and provides a more detailed response to the query. It asks for more specific information to provide a tailored solution, making it highly informative. However, it still lacks the profound insights or comprehensive information that would make it outstandingly helpful.",
"rationale_2-suggestion-metadata": {
"agent": "gpt-4",
"score": null,
"type": null
},
"response_1": " Thank you for reaching out! I\u0027m here to help you with your query. However, I must point out that the term \"locate\" can have different meanings depending on the context. Could you please provide more details or clarify what you mean by \"locate\"? Are you trying to find a specific card, such as a credit card or debit card? Or perhaps you\u0027re looking for something else entirely? Once I have a better understanding of your question, I\u0027ll be happy to assist you in a responsible and accurate manner.",
"response_2": " Of course! I\u0027d be happy to help you locate your missing card. To start, could you please tell me more about the card you\u0027re trying to find? Is it a credit card, debit card, or another type of card? Additionally, can you provide more details about the circumstances surrounding its disappearance? For example, did you lose it at home, in a store, or somewhere else? Any additional information you can provide will help me give you more tailored advice on how to locate your card."
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **input** is of type `TextField`.
* **response_1** is of type `TextField`.
* **response_2** is of type `TextField`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **rating_1** is of type `RatingQuestion` with the following allowed values [1, 2, 3, 4, 5].
* **rationale_1** is of type `TextQuestion`.
* **rating_2** is of type `RatingQuestion` with the following allowed values [1, 2, 3, 4, 5].
* **rationale_2** is of type `TextQuestion`.
* **✨ NEW** **Suggestions:** As of Argilla 1.13.0, suggestions have been included to assist annotators during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **rating_1-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
* (optional) **rationale_1-suggestion** is of type `text`.
* (optional) **rating_2-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
* (optional) **rationale_2-suggestion** is of type `text`.
Additionally, there is one more optional field:
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
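As a small illustration of the "-suggestion" naming convention described above, a flattened record (hand-built here to mirror the instance shown earlier, not fetched via the Argilla API) can be queried like this:

```python
# Hand-built record mirroring the flattened `datasets` row shown earlier.
record = {
    "input": "How do I locate my card?",
    "rating_1-suggestion": 3,
    "rating_2-suggestion": 4,
}

def suggestion_for(rec, question_name):
    """Look up a suggested value by appending '-suggestion' to the question name."""
    return rec.get(question_name + "-suggestion")

# Which response did the suggesting model (gpt-4) rate higher?
best = max(["rating_1", "rating_2"], key=lambda q: suggestion_for(record, q))
```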
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-40000 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 6350560719
num_examples: 1000
download_size: 1297464742
dataset_size: 6350560719
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/ferry_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ferry/フュリー (Fire Emblem)
This is the dataset of ferry/フュリー (Fire Emblem), containing 500 images and their tags.
The core tags of this character are `long_hair, green_hair, green_eyes, ponytail, breasts, earrings, bangs, large_breasts, very_long_hair, high_ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 723.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 409.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1211 | 856.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 643.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1211 | 1.19 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ferry_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 28 |  |  |  |  |  | 1girl, mask_on_head, official_alternate_costume, solo, chest_sarashi, tube_top, bandages, looking_at_viewer, smile, jewelry, single_bare_shoulder, cleavage, bandeau, midriff, blue_skirt, navel, single_sleeve, stomach, holding_weapon, blue_shirt, polearm, white_background, bare_shoulders, simple_background, collarbone, short_sleeves, standing, thighs |
| 1 | 5 |  |  |  |  |  | 1girl, blue_dress, fingerless_gloves, holding_sword, jewelry, looking_at_viewer, sheath, solo, katana, pelvic_curtain, short_sleeves, thighs, side_slit, black_gloves, boots, smile |
| 2 | 7 |  |  |  |  |  | 1girl, black_gloves, boots, simple_background, white_background, blue_dress, fingerless_gloves, holding_sword, pelvic_curtain, full_body, solo, side_slit |
| 3 | 10 |  |  |  |  |  | 1girl, dress, jewelry, solo, arrow_(projectile), fingerless_gloves, holding_bow_(weapon), white_background, feathers, simple_background, quiver, smile, fur_trim, hair_ornament, looking_at_viewer, pelvic_curtain, short_sleeves, cape, full_body, knee_boots, thighs, belt, closed_mouth, elbow_gloves, medium_breasts, shoulder_armor, standing |
| 4 | 8 |  |  |  |  |  | 1girl, blush, hair_flower, jewelry, official_alternate_costume, solo, bare_shoulders, blue_bikini, cleavage, looking_at_viewer, choker, collarbone, smile, simple_background, closed_mouth, navel, open_mouth, strapless_bikini, white_background |
| 5 | 7 |  |  |  |  |  | 1girl, jewelry, looking_at_viewer, ocean, outdoors, solo, beach, blue_bikini, blue_sky, cleavage, cloud, day, hair_flower, navel, official_alternate_costume, bare_shoulders, smile, strapless_bikini, blush, collarbone, thighs, water, choker, wet |
| 6 | 9 |  |  |  |  |  | bare_shoulders, bride, necklace, strapless_dress, wedding_dress, 1girl, hair_flower, official_alternate_costume, white_dress, cleavage, looking_at_viewer, solo, bouquet, open_mouth, smile, bridal_veil, detached_sleeves, gloves, medium_breasts |
| 7 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, solo_focus, sweat, open_mouth, mosaic_censoring, nipples, cum, erection, jewelry, licking_penis, nude, tongue_out |
| 8 | 11 |  |  |  |  |  | 1girl, blush, hetero, solo_focus, 1boy, penis, sex, open_mouth, nipples, pussy, vaginal, completely_nude, mosaic_censoring, smile, jewelry, spread_legs, sweat, looking_at_viewer, ass, dark-skinned_male, gloves, lying, navel, straddling, thighhighs |
| 9 | 6 |  |  |  |  |  | 1girl, elbow_gloves, armpits, arms_up, blush, looking_at_viewer, solo, thighhighs, white_gloves, armor, open_mouth, pole_dancing, stripper_pole, :d, censored, pussy, spread_legs, sweat, thigh_boots |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | mask_on_head | official_alternate_costume | solo | chest_sarashi | tube_top | bandages | looking_at_viewer | smile | jewelry | single_bare_shoulder | cleavage | bandeau | midriff | blue_skirt | navel | single_sleeve | stomach | holding_weapon | blue_shirt | polearm | white_background | bare_shoulders | simple_background | collarbone | short_sleeves | standing | thighs | blue_dress | fingerless_gloves | holding_sword | sheath | katana | pelvic_curtain | side_slit | black_gloves | boots | full_body | dress | arrow_(projectile) | holding_bow_(weapon) | feathers | quiver | fur_trim | hair_ornament | cape | knee_boots | belt | closed_mouth | elbow_gloves | medium_breasts | shoulder_armor | blush | hair_flower | blue_bikini | choker | open_mouth | strapless_bikini | ocean | outdoors | beach | blue_sky | cloud | day | water | wet | bride | necklace | strapless_dress | wedding_dress | white_dress | bouquet | bridal_veil | detached_sleeves | gloves | 1boy | hetero | solo_focus | sweat | mosaic_censoring | nipples | cum | erection | licking_penis | nude | tongue_out | penis | sex | pussy | vaginal | completely_nude | spread_legs | ass | dark-skinned_male | lying | straddling | thighhighs | armpits | arms_up | white_gloves | armor | pole_dancing | stripper_pole | :d | censored | thigh_boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------------------------|:-------|:----------------|:-----------|:-----------|:--------------------|:--------|:----------|:-----------------------|:-----------|:----------|:----------|:-------------|:--------|:----------------|:----------|:-----------------|:-------------|:----------|:-------------------|:-----------------|:--------------------|:-------------|:----------------|:-----------|:---------|:-------------|:--------------------|:----------------|:---------|:---------|:-----------------|:------------|:---------------|:--------|:------------|:--------|:---------------------|:-----------------------|:-----------|:---------|:-----------|:----------------|:-------|:-------------|:-------|:---------------|:---------------|:-----------------|:-----------------|:--------|:--------------|:--------------|:---------|:-------------|:-------------------|:--------|:-----------|:--------|:-----------|:--------|:------|:--------|:------|:--------|:-----------|:------------------|:----------------|:--------------|:----------|:--------------|:-------------------|:---------|:-------|:---------|:-------------|:--------|:-------------------|:----------|:------|:-----------|:----------------|:-------|:-------------|:--------|:------|:--------|:----------|:------------------|:--------------|:------|:--------------------|:--------|:-------------|:-------------|:----------|:----------|:---------------|:--------|:---------------|:----------------|:-----|:-----------|:--------------|
| 0 | 28 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | X | | | | X | X | X | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | | | | | | X | | X | | | | | X | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | | | X | | | | X | X | X | | | | | | | | | | | | X | | X | | X | X | X | | X | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | X | X | | | | X | X | X | | X | | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | X | X | | | | X | X | X | | X | | | | X | | | | | | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | X | X | | | | X | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 8 | 11 |  |  |  |  |  | X | | | | | | | X | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | X | | | | | X | X | X | X | X | X | X | X | X | X |
|
sezenkarakus/image-dataset-v2 | ---
dataset_info:
features:
- name: file_name
dtype: string
- name: event
dtype: string
- name: place
dtype: string
- name: venue
dtype: string
- name: location
dtype: string
- name: category
dtype: string
- name: text
dtype: string
- name: description
dtype: string
- name: colors
dtype: string
- name: lifeforms
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 6820002498.125
num_examples: 21663
download_size: 6813360317
dataset_size: 6820002498.125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jadericdawson/Test1 | ---
task_categories:
- question-answering
pretty_name: Test1
size_categories:
- 1K<n<10K
language:
- en
--- |
AWeirdDev/zh-tw-articles-6k | ---
language:
- zh
license: mit
size_categories:
- 1K<n<10K
task_categories:
- text-generation
pretty_name: zh-tw-articles-6k
tags:
- medical
- finance
- webdataset
dataset_info:
features:
- name: image
dtype: string
- name: title
dtype: string
- name: content
dtype: string
- name: tag
dtype: string
- name: author
dtype: string
- name: timestamp
dtype: string
- name: link
dtype: string
splits:
- name: train
num_bytes: 15644219
num_examples: 6000
download_size: 10442214
dataset_size: 15644219
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
<img alt="Taiwan Flag"
src="https://github.com/AWeirdScratcher/tvbox/assets/90096971/1995d468-fee7-44c3-8bfb-298884d559eb"
align="right"
width="160" />
# zh-tw-articles-**6k**
This dataset contains Taiwanese news articles scraped from [storm.mg](https://www.storm.mg) in March 2024.
- Size: `10.4MB` download (15,644,219 bytes uncompressed)
- Rows: `6000` (Max)
```python
Dataset({
features: ['image', 'title', 'content', 'tag', 'author', 'timestamp', 'link'],
num_rows: 6000
})
```
## Use The Dataset
Use [🤗 Datasets](https://pypi.org/project/datasets) to download, use or modify this dataset.
```python
from datasets import load_dataset
dataset = load_dataset("AWeirdDev/zh-tw-articles-6k")
``` |
ibranze/araproje_arc_tr_conf_gpt2_nearestscore_true_y | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 86423.0
num_examples: 250
download_size: 50655
dataset_size: 86423.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_arc_tr_conf_gpt2_nearestscore_true_y"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
proxima/SD_1-5_reg_images | ---
license: creativeml-openrail-m
---
1,000 regularization images for the class "artstyle", generated with Stable Diffusion 1.5 for use with the [JoePenna Dreambooth repo](https://github.com/JoePenna/Dreambooth-Stable-Diffusion).
```
seed: 10
ddim_eta: 0.0
scale: 10.0
ddim_steps: 50
```
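For reference, a hedged sketch of how these settings would map onto the `diffusers` library (an assumption for illustration; the JoePenna repo ships its own generation script):

```python
# A minimal, untested sketch of the generation settings above expressed
# via diffusers' StableDiffusionPipeline; illustrative equivalent only.
def generate_reg_image(pipe, prompt: str = "artstyle"):
    """Generate one regularization image with the settings listed above."""
    import torch  # assumed installed alongside diffusers

    generator = torch.Generator().manual_seed(10)  # seed: 10
    result = pipe(
        prompt,
        num_inference_steps=50,  # ddim_steps: 50
        guidance_scale=10.0,     # scale: 10.0
        eta=0.0,                 # ddim_eta: 0.0 (applies with a DDIM scheduler)
        generator=generator,
    )
    return result.images[0]
```

Here `pipe` would be a `StableDiffusionPipeline` loaded from an SD 1.5 checkpoint with a DDIM scheduler.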
|
ruanchaves/assin2_por_Latn_to_spa_Latn | ---
dataset_info:
features:
- name: sentence_pair_id
dtype: int64
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: relatedness_score
dtype: float32
- name: entailment_judgment
dtype:
class_label:
names:
'0': NONE
'1': ENTAILMENT
- name: __language__
dtype: string
splits:
- name: train
num_bytes: 926284
num_examples: 6500
- name: test
num_bytes: 359987
num_examples: 2448
- name: validation
num_bytes: 71410
num_examples: 500
download_size: 0
dataset_size: 1357681
---
# Dataset Card for "assin2_por_Latn_to_spa_Latn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
orkg/SciQA | ---
annotations_creators:
- expert-generated
- auto-generated
language:
- en
language_creators:
- machine-generated
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: 'The SciQA Scientific Question Answering Benchmark for Scholarly Knowledge'
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- knowledge-base-qa
task_categories:
- question-answering
task_ids: []
---
# Dataset Card for SciQA
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [SciQA Homepage]()
- **Repository:** [SciQA Repository](https://zenodo.org/record/7744048)
- **Paper:** The SciQA Scientific Question Answering Benchmark for Scholarly Knowledge
- **Point of Contact:** [Yaser Jaradeh](mailto:Yaser.Jaradeh@tib.eu)
### Dataset Summary
SciQA contains 2,565 pairs of questions and SPARQL queries, along with answers fetched from the Open Research Knowledge Graph (ORKG) via a Virtuoso SPARQL endpoint. It is a collection of both handcrafted and auto-generated questions and queries. The dataset is split into 70% training, 10% validation and 20% test examples.
## Dataset Structure
### Data Instances
An example of a question is given below:
```json
{
"id": "AQ2251",
"query_type": "Factoid",
"question": {
"string": "Provide a list of papers that have utilized the Depth DDPPO model and include the links to their code?"
},
"paraphrased_question": [],
"query": {
"sparql": "SELECT DISTINCT ?code\nWHERE {\n ?model a orkgc:Model;\n rdfs:label ?model_lbl.\n FILTER (str(?model_lbl) = \"Depth DDPPO\")\n ?benchmark orkgp:HAS_DATASET ?dataset.\n ?cont orkgp:HAS_BENCHMARK ?benchmark.\n ?cont orkgp:HAS_MODEL ?model;\n orkgp:HAS_SOURCE_CODE ?code.\n}"
},
"template_id": "T07",
"auto_generated": true,
"query_shape": "Tree",
"query_class": "WHICH-WHAT",
"number_of_patterns": 4,
}
```
### Data Fields
- `id`: the id of the question
- `question`: the natural-language question (stored under a nested `string` key)
- `paraphrased_question`: a set of paraphrased versions of the question
- `query`: a SPARQL query that answers the question
- `query_type`: the type of the query
- `template_id`: the id of the optional template used for the query (the `template_id` field in the example above)
- `query_shape`: a string indicating the shape of the query
- `query_class`: a string indicating the class of the query
- `auto_generated`: a boolean indicating whether the question is auto-generated or not
- `number_of_patterns`: an integer indicating the number of graph patterns in the query
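As a quick sketch of how these fields nest, the example instance above can be unpacked as a plain dict (the record below is abbreviated from that instance, not fetched from the dataset):

```python
# Abbreviated copy of the example instance above; the nested fields are
# reached through their inner keys ("string" and "sparql").
record = {
    "id": "AQ2251",
    "query_type": "Factoid",
    "question": {"string": "Provide a list of papers that have utilized "
                           "the Depth DDPPO model and include the links "
                           "to their code?"},
    "paraphrased_question": [],
    "query": {"sparql": "SELECT DISTINCT ?code WHERE { ... }"},
    "template_id": "T07",
    "auto_generated": True,
    "query_shape": "Tree",
    "query_class": "WHICH-WHAT",
    "number_of_patterns": 4,
}

question_text = record["question"]["string"]  # the natural-language question
sparql_query = record["query"]["sparql"]      # its gold SPARQL translation
```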
### Data Splits
The dataset is split into 70% training, 10% validation and 20% test questions.
## Additional Information
### Licensing Information
SciQA is licensed under the [Creative Commons Attribution 4.0 International License (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/).
### Citation Information
```bibtex
@Article{SciQA2023,
author={Auer, S{\"o}ren
and Barone, Dante A. C.
and Bartz, Cassiano
and Cortes, Eduardo G.
and Jaradeh, Mohamad Yaser
and Karras, Oliver
and Koubarakis, Manolis
and Mouromtsev, Dmitry
and Pliukhin, Dmitrii
and Radyush, Daniil
and Shilin, Ivan
and Stocker, Markus
and Tsalapati, Eleni},
title={The SciQA Scientific Question Answering Benchmark for Scholarly Knowledge},
journal={Scientific Reports},
year={2023},
month={May},
day={04},
volume={13},
number={1},
pages={7240},
abstract={Knowledge graphs have gained increasing popularity in the last decade in science and technology. However, knowledge graphs are currently relatively simple to moderate semantic structures that are mainly a collection of factual statements. Question answering (QA) benchmarks and systems were so far mainly geared towards encyclopedic knowledge graphs such as DBpedia and Wikidata. We present SciQA a scientific QA benchmark for scholarly knowledge. The benchmark leverages the Open Research Knowledge Graph (ORKG) which includes almost 170,000 resources describing research contributions of almost 15,000 scholarly articles from 709 research fields. Following a bottom-up methodology, we first manually developed a set of 100 complex questions that can be answered using this knowledge graph. Furthermore, we devised eight question templates with which we automatically generated further 2465 questions, that can also be answered with the ORKG. The questions cover a range of research fields and question types and are translated into corresponding SPARQL queries over the ORKG. Based on two preliminary evaluations, we show that the resulting SciQA benchmark represents a challenging task for next-generation QA systems. This task is part of the open competitions at the 22nd International Semantic Web Conference 2023 as the Scholarly Question Answering over Linked Data (QALD) Challenge.},
issn={2045-2322},
doi={10.1038/s41598-023-33607-z},
url={https://doi.org/10.1038/s41598-023-33607-z}
}
```
### Contributions
Thanks to [@YaserJaradeh](https://github.com/YaserJaradeh) for adding this dataset. |
anan-2024/twitter_dataset_1713157548 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 95229
num_examples: 253
download_size: 54338
dataset_size: 95229
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gate369/as-ni-json | ---
license: apache-2.0
---
|
thobauma/harmless-poisoned-0.01-SUDO-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/semeval-task-8-b-v2-mistral-7b | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: model
dtype: string
- name: source
dtype: string
- name: label
dtype: int64
- name: id
dtype: int64
- name: mistral-7b_estimated_loss
dtype: float64
- name: mistral-7b_mean_lowest25
dtype: float64
- name: mistral-7b_mean_highest25
dtype: float64
- name: mistral-7b_max
dtype: float64
- name: mistral-7b_min
dtype: float64
- name: mistral-7b_range
dtype: float64
- name: mistral-7b_mean
dtype: float64
- name: mistral-7b_std
dtype: float64
- name: mistral-7b_entropy
dtype: float64
- name: mistral-7b_kurtosis
dtype: float64
- name: mistral-7b_skewness
dtype: float64
- name: mistral-7b_perplexity
dtype: float64
splits:
- name: train
num_bytes: 127022360
num_examples: 56821
- name: val
num_bytes: 31364223
num_examples: 14206
- name: test
num_bytes: 5102312
num_examples: 3000
download_size: 96394782
dataset_size: 163488895
---
# Dataset Card for "semeval-task-8-b-v2-mistral-7b"
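The per-example statistics in the schema above (mean, std, min/max, range, mean of the lowest/highest 25%, skewness, kurtosis, perplexity) look like aggregates over per-token loss values from the `mistral-7b` model. A hypothetical numpy sketch of how such features could be derived (the actual extraction pipeline is not documented here):

```python
import numpy as np

# Hypothetical per-token losses for one text; the dataset stores only the
# aggregated features, and the real pipeline is an assumption.
token_losses = np.array([1.2, 0.4, 2.8, 0.9, 3.1, 0.5, 1.7, 2.2])

q25, q75 = np.percentile(token_losses, [25, 75])
m, s = token_losses.mean(), token_losses.std()
features = {
    "mean": m,
    "std": s,
    "max": token_losses.max(),
    "min": token_losses.min(),
    "range": token_losses.max() - token_losses.min(),
    "mean_lowest25": token_losses[token_losses <= q25].mean(),
    "mean_highest25": token_losses[token_losses >= q75].mean(),
    "skewness": ((token_losses - m) ** 3).mean() / s**3,
    "kurtosis": ((token_losses - m) ** 4).mean() / s**4 - 3.0,
    "perplexity": float(np.exp(m)),  # exp of the mean token loss
}
```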
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PosteriorAI/dravida_alpaca_transliterated | ---
license: mit
---
|
oubabuliao/my-image-captioning-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '10'
'3': '11'
'4': '12'
'5': '13'
'6': '14'
'7': '15'
'8': '16'
'9': '17'
'10': '18'
'11': '19'
'12': '2'
'13': '20'
'14': '21'
'15': '22'
'16': '23'
'17': '24'
'18': '25'
'19': '26'
'20': '27'
'21': '28'
'22': '29'
'23': '3'
'24': '30'
'25': '31'
'26': '32'
'27': '33'
'28': '34'
'29': '35'
'30': '36'
'31': '37'
'32': '38'
'33': '4'
'34': '5'
'35': '6'
'36': '7'
'37': '8'
'38': '9'
splits:
- name: train
num_bytes: 146019439.0
num_examples: 39
download_size: 140460608
dataset_size: 146019439.0
---
# Dataset Card for "my-image-captioning-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SivaSegu/mini-platypus-two | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2253597
num_examples: 1000
download_size: 1118543
dataset_size: 2253597
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lltala/edgar_all_4-simple-no-valid-roberta-base | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: doc_id
dtype: string
- name: id
dtype: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': I-PER
'2': I-ORG
'3': I-LOC
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 7056516
num_examples: 930
- name: validation
num_bytes: 676038
num_examples: 90
download_size: 910413
dataset_size: 7732554
---
# Dataset Card for "edgar_all_4-simple-no-valid-roberta-base"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Snorlax51/Harshkumar23 | ---
license: artistic-2.0
---
|
danigr7/prueba2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: bbox
sequence:
sequence: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 22925634.88
num_examples: 1440
download_size: 21379049
dataset_size: 22925634.88
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kevind13/vuejs-nuxt-tailwind-codellama-examples | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 33764
num_examples: 33
download_size: 30538
dataset_size: 33764
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ruoyunbai/eagleImage | ---
license: openrail
---
|
Memin25/biggerdatasets | ---
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 88737364.65535891
num_examples: 1456363
- name: validation
num_bytes: 9859761.344641086
num_examples: 161819
download_size: 59799132
dataset_size: 98597126.0
---
# Dataset Card for "biggerdatasets"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Kabster__Bio-Mistralv2-Squared | ---
pretty_name: Evaluation run of Kabster/Bio-Mistralv2-Squared
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kabster/Bio-Mistralv2-Squared](https://huggingface.co/Kabster/Bio-Mistralv2-Squared)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kabster__Bio-Mistralv2-Squared\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T05:14:57.667370](https://huggingface.co/datasets/open-llm-leaderboard/details_Kabster__Bio-Mistralv2-Squared/blob/main/results_2024-03-10T05-14-57.667370.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5947088232667346,\n\
\ \"acc_stderr\": 0.033001873861023094,\n \"acc_norm\": 0.605373945602307,\n\
\ \"acc_norm_stderr\": 0.033881891438851675,\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.6098919620417469,\n\
\ \"mc2_stderr\": 0.015392589865179624\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5938566552901023,\n \"acc_stderr\": 0.014351656690097862,\n\
\ \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.657239593706433,\n\
\ \"acc_stderr\": 0.004736621698861176,\n \"acc_norm\": 0.8401712806213901,\n\
\ \"acc_norm_stderr\": 0.0036569821653861666\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544057,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.0250437573185202,\n \"acc_norm\"\
: 0.3835978835978836,\n \"acc_norm_stderr\": 0.0250437573185202\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n\
\ \"acc_stderr\": 0.025736542745594528,\n \"acc_norm\": 0.7129032258064516,\n\
\ \"acc_norm_stderr\": 0.025736542745594528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710862,\n\
\ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710862\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803064,\n \"\
acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n\
\ \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n\
\ \"acc_stderr\": 0.014957458504335833,\n \"acc_norm\": 0.7739463601532567,\n\
\ \"acc_norm_stderr\": 0.014957458504335833\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.02541600377316554,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.02541600377316554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28268156424581004,\n\
\ \"acc_stderr\": 0.015060381730018103,\n \"acc_norm\": 0.28268156424581004,\n\
\ \"acc_norm_stderr\": 0.015060381730018103\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.0264930332251459,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.0264930332251459\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.026795422327893934,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.026795422327893934\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42894393741851367,\n\
\ \"acc_stderr\": 0.012640625443067361,\n \"acc_norm\": 0.42894393741851367,\n\
\ \"acc_norm_stderr\": 0.012640625443067361\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.02981263070156974,\n\
\ \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.02981263070156974\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5996732026143791,\n \"acc_stderr\": 0.019821843688271758,\n \
\ \"acc_norm\": 0.5996732026143791,\n \"acc_norm_stderr\": 0.019821843688271758\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.6098919620417469,\n\
\ \"mc2_stderr\": 0.015392589865179624\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712666\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \
\ \"acc_stderr\": 0.0010717793485492668\n }\n}\n```"
repo_url: https://huggingface.co/Kabster/Bio-Mistralv2-Squared
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|arc:challenge|25_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|gsm8k|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hellaswag|10_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|winogrande|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|winogrande|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T05-14-57.667370.parquet'
- config_name: results
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- results_2024-03-09T23-11-18.304357.parquet
- split: 2024_03_10T05_14_57.667370
path:
- results_2024-03-10T05-14-57.667370.parquet
- split: latest
path:
- results_2024-03-10T05-14-57.667370.parquet
---
# Dataset Card for Evaluation run of Kabster/Bio-Mistralv2-Squared
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kabster/Bio-Mistralv2-Squared](https://huggingface.co/Kabster/Bio-Mistralv2-Squared) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kabster__Bio-Mistralv2-Squared",
"harness_winogrande_5",
	split="latest")
```
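Because splits are timestamped, the newest run can also be picked programmatically instead of relying on the "latest" alias. A minimal sketch, using the split names from the YAML above (the `YYYY_MM_DDTHH_MM_SS.micro` layout sorts lexicographically in chronological order):

```python
# Hedged sketch: select the newest timestamped split name.
# Lexicographic order matches chronological order for this layout.
splits = [
    "2024_03_09T23_11_18.304357",
    "2024_03_10T05_14_57.667370",
]
latest = max(splits)
print(latest)  # 2024_03_10T05_14_57.667370
```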
## Latest results
These are the [latest results from run 2024-03-10T05:14:57.667370](https://huggingface.co/datasets/open-llm-leaderboard/details_Kabster__Bio-Mistralv2-Squared/blob/main/results_2024-03-10T05-14-57.667370.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each one appears in the "results" config and in the "latest" split of its eval config):
```python
{
"all": {
"acc": 0.5947088232667346,
"acc_stderr": 0.033001873861023094,
"acc_norm": 0.605373945602307,
"acc_norm_stderr": 0.033881891438851675,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.6098919620417469,
"mc2_stderr": 0.015392589865179624
},
"harness|arc:challenge|25": {
"acc": 0.5938566552901023,
"acc_stderr": 0.014351656690097862,
"acc_norm": 0.6296928327645052,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.657239593706433,
"acc_stderr": 0.004736621698861176,
"acc_norm": 0.8401712806213901,
"acc_norm_stderr": 0.0036569821653861666
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544057,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.0250437573185202,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.0250437573185202
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7129032258064516,
"acc_stderr": 0.025736542745594528,
"acc_norm": 0.7129032258064516,
"acc_norm_stderr": 0.025736542745594528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710862,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710862
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7743119266055046,
"acc_stderr": 0.017923087667803064,
"acc_norm": 0.7743119266055046,
"acc_norm_stderr": 0.017923087667803064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.02981802474975309,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.02981802474975309
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335833,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335833
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.02541600377316554,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.02541600377316554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28268156424581004,
"acc_stderr": 0.015060381730018103,
"acc_norm": 0.28268156424581004,
"acc_norm_stderr": 0.015060381730018103
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.0264930332251459,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.0264930332251459
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893934,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893934
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100793,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100793
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42894393741851367,
"acc_stderr": 0.012640625443067361,
"acc_norm": 0.42894393741851367,
"acc_norm_stderr": 0.012640625443067361
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5955882352941176,
"acc_stderr": 0.02981263070156974,
"acc_norm": 0.5955882352941176,
"acc_norm_stderr": 0.02981263070156974
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5996732026143791,
"acc_stderr": 0.019821843688271758,
"acc_norm": 0.5996732026143791,
"acc_norm_stderr": 0.019821843688271758
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106568,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.6098919620417469,
"mc2_stderr": 0.015392589865179624
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.011690933809712666
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492668
}
}
```
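The top-level "all" block aggregates the per-task scores. As a hedged, illustrative sketch of how such a macro-average can be computed from results in this shape (only two tasks copied from the JSON above, so the mean shown is not the leaderboard's "all" value):

```python
import json
import statistics

# Two per-task entries copied from the results above (illustrative subset).
results = json.loads("""
{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.5703703703703704}
}
""")

# Unweighted mean over tasks (a macro-average).
macro_acc = statistics.mean(task["acc"] for task in results.values())
print(round(macro_acc, 4))  # 0.4502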
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hemz/dataset-test01 | ---
license: apache-2.0
---
|
nlp-with-deeplearning/ko.openhermes | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-generation
language:
- en
- ko
tags:
- distillation
- synthetic data
- gpt
size_categories:
- 100K<n<1M
---
Original dataset: [teknium/openhermes](https://huggingface.co/datasets/teknium/openhermes) |
BangumiBase/saijakutamerwagomihiroinotabiwohajimemashita | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Saijaku Tamer Wa Gomi Hiroi No Tabi Wo Hajimemashita
This is the image base of bangumi Saijaku Tamer wa Gomi Hiroi no Tabi wo Hajimemashita, we detected 81 characters, 6058 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noise.** If you intend to train models on this dataset manually, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (roughly 1% of images).
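After extracting a per-character `dataset.zip`, the images can be enumerated for the manual noise-filtering pass suggested above. A minimal, self-contained sketch (a temporary directory with hypothetical filenames stands in for the real extracted archive):

```python
import tempfile
from pathlib import Path

# Stand-in for an extracted character archive (hypothetical filenames).
root = Path(tempfile.mkdtemp())
for name in ("preview_1.png", "preview_2.png", "notes.txt"):
    (root / name).touch()

# Collect only the image files, e.g. to review them one by one.
images = sorted(p.name for p in root.glob("*.png"))
print(images)  # ['preview_1.png', 'preview_2.png']
```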
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 2355 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 37 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 161 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 34 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 20 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 23 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 16 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 16 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 31 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 47 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 128 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 6 | [Download](11/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 12 | 59 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 116 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 216 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 24 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 420 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 128 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 32 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 37 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 27 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 31 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 11 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 248 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 30 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 48 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 15 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 18 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 14 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 23 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 8 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 42 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 44 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 13 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 95 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 63 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 19 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 33 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 14 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 10 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 10 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 10 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 33 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 9 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 14 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 84 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 10 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 11 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 28 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 11 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 16 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 7 | [Download](51/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 52 | 37 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 7 | [Download](53/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 54 | 65 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 18 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 9 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 9 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 17 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 21 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 12 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 14 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 9 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 9 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 8 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 14 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 6 | [Download](66/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 67 | 337 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 15 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 21 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 138 | [Download](70/dataset.zip) |  |  |  |  |  |  |  |  |
| 71 | 32 | [Download](71/dataset.zip) |  |  |  |  |  |  |  |  |
| 72 | 39 | [Download](72/dataset.zip) |  |  |  |  |  |  |  |  |
| 73 | 58 | [Download](73/dataset.zip) |  |  |  |  |  |  |  |  |
| 74 | 29 | [Download](74/dataset.zip) |  |  |  |  |  |  |  |  |
| 75 | 28 | [Download](75/dataset.zip) |  |  |  |  |  |  |  |  |
| 76 | 9 | [Download](76/dataset.zip) |  |  |  |  |  |  |  |  |
| 77 | 8 | [Download](77/dataset.zip) |  |  |  |  |  |  |  |  |
| 78 | 14 | [Download](78/dataset.zip) |  |  |  |  |  |  |  |  |
| 79 | 8 | [Download](79/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 112 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
EleutherAI/quirky_squaring_raw | ---
dataset_info:
features:
- name: id
dtype: string
- name: template_args
struct:
- name: character
dtype: string
- name: op1
dtype: int64
- name: result
dtype: int64
- name: character
dtype: string
- name: label
dtype: bool
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: difficulty_quantile
dtype: float64
splits:
- name: train
num_bytes: 11109000
num_examples: 184000
- name: validation
num_bytes: 483000
num_examples: 8000
- name: test
num_bytes: 483000
num_examples: 8000
download_size: 6842851
dataset_size: 12075000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-staging-eval-samsum-samsum-f593d1-14645992 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP13
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP13
* Dataset: samsum
* Config: samsum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
fasterinnerlooper/codereviewer | ---
configs:
- config_name: train_generation
data_files:
- "generation/gen-train.jsonl"
- config_name: test_generation
data_files:
- "generation/gen-test.jsonl"
- config_name: validation_generation
data_files:
- "generation/gen-valid.jsonl"
- config_name: train_refinement
data_files:
- "refinement/ref-train.jsonl"
- config_name: test_refinement
data_files:
- "refinement/ref-test.jsonl"
- config_name: validation_refinement
data_files:
- "refinement/ref-valid.jsonl"
- config_name: train_quality
data_files:
- "quality/cls-train-chunk-0.jsonl"
- "quality/cls-train-chunk-1.jsonl"
- "quality/cls-train-chunk-2.jsonl"
- "quality/cls-train-chunk-3.jsonl"
- config_name: test_quality
data_files:
- "quality/cls-test.jsonl"
- config_name: validation_quality
data_files:
- "quality/cls-valid.jsonl"
--- |
ArmelR/the-pile-splitted | ---
configs:
- config_name: all
data_files:
- split: train
path:
- "data/ArXiv/train/*.arrow"
- "data/BookCorpus2/train/*.arrow"
- "data/Books3/train/*.arrow"
- "data/DM Mathematics/train/*.arrow"
- "data/Enron Emails/train/*.arrow"
- "data/EuroParl/train/*.arrow"
- "data/FreeLaw/train/*.arrow"
- "data/Github/train/*.arrow"
- "data/Gutenberg (PG-19)/train/*.arrow"
- "data/HackerNews/train/*.arrow"
- "data/NIH ExPorter/train/*.arrow"
- "data/OpenSubtitles/train/*.arrow"
- "data/OpenWebText2/train/*.arrow"
- "data/PhilPapers/train/*.arrow"
- "data/Pile-CC/train/*.arrow"
- "data/PubMed Abstracts/train/*.arrow"
- "data/PubMed Central/train/*.arrow"
- "data/StackExchange/train/*.arrow"
- "data/UPSTO Backgrounds/train/*.arrow"
- "data/Ubuntu IRC/train/*.arrow"
- "data/Wikipedia (en)/train/*.arrow"
- "data/YoutubeSubtitles/train/*.arrow"
- split: test
path:
- "data/ArXiv/test/*.arrow"
- "data/BookCorpus2/test/*.arrow"
- "data/Books3/test/*.arrow"
- "data/DM Mathematics/test/*.arrow"
- "data/Enron Emails/test/*.arrow"
- "data/EuroParl/test/*.arrow"
- "data/FreeLaw/test/*.arrow"
- "data/Github/test/*.arrow"
- "data/Gutenberg (PG-19)/test/*.arrow"
- "data/HackerNews/test/*.arrow"
- "data/NIH ExPorter/test/*.arrow"
- "data/OpenSubtitles/test/*.arrow"
- "data/OpenWebText2/test/*.arrow"
- "data/PhilPapers/test/*.arrow"
- "data/Pile-CC/test/*.arrow"
- "data/PubMed Abstracts/test/*.arrow"
- "data/PubMed Central/test/*.arrow"
- "data/StackExchange/test/*.arrow"
- "data/UPSTO Backgrounds/test/*.arrow"
- "data/Ubuntu IRC/test/*.arrow"
- "data/Wikipedia (en)/test/*.arrow"
- "data/YoutubeSubtitles/test/*.arrow"
default: true
- config_name: ArXiv
data_files:
- split: train
path: "data/ArXiv/train/*.arrow"
- split: test
path: "data/ArXiv/test/*.arrow"
- config_name: BookCorpus2
data_files:
- split: train
path: "data/BookCorpus2/train/*.arrow"
- split: test
path: "data/BookCorpus2/test/*.arrow"
- config_name: Books3
data_files:
- split: train
path: "data/Books3/train/*.arrow"
- split: test
path: "data/Books3/test/*.arrow"
- config_name: DM Mathematics
data_files:
- split: train
path: "data/DM Mathematics/train/*.arrow"
- split: test
path: "data/DM Mathematics/test/*.arrow"
- config_name: Enron Emails
data_files:
- split: train
path: "data/Enron Emails/train/*.arrow"
- split: test
path: "data/Enron Emails/test/*.arrow"
- config_name: EuroParl
data_files:
- split: train
path: "data/EuroParl/train/*.arrow"
- split: test
path: "data/EuroParl/test/*.arrow"
- config_name: FreeLaw
data_files:
- split: train
path: "data/FreeLaw/train/*.arrow"
- split: test
path: "data/FreeLaw/test/*.arrow"
- config_name: Github
data_files:
- split: train
path: "data/Github/train/*.arrow"
- split: test
path: "data/Github/test/*.arrow"
- config_name: Gutenberg (PG-19)
data_files:
- split: train
path: "data/Gutenberg (PG-19)/train/*.arrow"
- split: test
path: "data/Gutenberg (PG-19)/test/*.arrow"
- config_name: HackerNews
data_files:
- split: train
path: "data/HackerNews/train/*.arrow"
- split: test
path: "data/HackerNews/test/*.arrow"
- config_name: NIH ExPorter
data_files:
- split: train
path: "data/NIH ExPorter/train/*.arrow"
- split: test
path: "data/NIH ExPorter/test/*.arrow"
- config_name: OpenSubtitles
data_files:
- split: train
path: "data/OpenSubtitles/train/*.arrow"
- split: test
path: "data/OpenSubtitles/test/*.arrow"
- config_name: OpenWebText2
data_files:
- split: train
path: "data/OpenWebText2/train/*.arrow"
- split: test
path: "data/OpenWebText2/test/*.arrow"
- config_name: PhilPapers
data_files:
- split: train
path: "data/PhilPapers/train/*.arrow"
- split: test
path: "data/PhilPapers/test/*.arrow"
- config_name: Pile-CC
data_files:
- split: train
path: "data/Pile-CC/train/*.arrow"
- split: test
path: "data/Pile-CC/test/*.arrow"
- config_name: PubMed Abstracts
data_files:
- split: train
path: "data/PubMed Abstracts/train/*.arrow"
- split: test
path: "data/PubMed Abstracts/test/*.arrow"
- config_name: PubMed Central
data_files:
- split: train
path: "data/PubMed Central/train/*.arrow"
- split: test
path: "data/PubMed Central/test/*.arrow"
- config_name: StackExchange
data_files:
- split: train
path: "data/StackExchange/train/*.arrow"
- split: test
path: "data/StackExchange/test/*.arrow"
- config_name: UPSTO Backgrounds
data_files:
- split: train
path: "data/UPSTO Backgrounds/train/*.arrow"
- split: test
path: "data/UPSTO Backgrounds/test/*.arrow"
- config_name: Ubuntu IRC
data_files:
- split: train
path: "data/Ubuntu IRC/train/*.arrow"
- split: test
path: "data/Ubuntu IRC/test/*.arrow"
- config_name: Wikipedia (en)
data_files:
- split: train
path: "data/Wikipedia (en)/train/*.arrow"
- split: test
path: "data/Wikipedia (en)/test/*.arrow"
- config_name: YoutubeSubtitles
data_files:
- split: train
path: "data/YoutubeSubtitles/train/*.arrow"
- split: test
path: "data/YoutubeSubtitles/test/*.arrow"
---
# Dataset description
[The Pile](https://arxiv.org/abs/2101.00027) is an 800GB dataset of English text
designed by EleutherAI to train large-scale language models. The original version of
the dataset can be found [here](https://huggingface.co/datasets/EleutherAI/pile).
The dataset is divided into 22 smaller high-quality datasets. For more information
about each of them, please refer to [the datasheet for the Pile](https://arxiv.org/abs/2201.07311).
However, the current version of the dataset available on the Hub is not split accordingly.
We solved this problem to improve the user experience when dealing with the Pile via the Hub.
Here is an instance of the pile
```
{
'meta': {'pile_set_name': 'Pile-CC'},
'text': 'It is done, and submitted. You can play “Survival of the Tastiest” on Android, and on the web. Playing on...'
}
```
We used the `meta` column to properly divide the dataset into subsets. Each instance `example` belongs to the subset
`domain`, where `domain = example['meta']['pile_set_name']`. By doing this, we were able to create a [new version of the Pile](https://huggingface.co/datasets/ArmelR/sharded-pile)
that is properly divided, each instance having a new column `domain`.
We further split each subset into train/test (97%/3%) to build the current dataset, which has the following structure:
```
data
ArXiv
train
test
BookCorpus2
train
test
Books3
train
test
```
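The subset-splitting step described above can be sketched in a few lines. This is a minimal illustration with invented toy records, not the actual processing code; only the nested `meta` layout and the 97%/3% ratio come from the description above.

```python
# Toy sketch of grouping Pile examples by `meta['pile_set_name']` and
# carving each group into a 97%/3% train/test split. The `examples`
# list below is invented illustration data, not the real Pile.
from collections import defaultdict

examples = [
    {"meta": {"pile_set_name": "Pile-CC"}, "text": "a web page"},
    {"meta": {"pile_set_name": "ArXiv"}, "text": "a paper"},
    {"meta": {"pile_set_name": "Pile-CC"}, "text": "another web page"},
]

# Group every example under its domain, adding the new `domain` column.
by_domain = defaultdict(list)
for example in examples:
    domain = example["meta"]["pile_set_name"]
    by_domain[domain].append({**example, "domain": domain})

# Split each domain into train/test (97%/3%).
splits = {}
for domain, rows in by_domain.items():
    cut = max(1, round(len(rows) * 0.97))
    splits[domain] = {"train": rows[:cut], "test": rows[cut:]}
```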
# Usage
```python
from datasets import load_dataset
dataset = load_dataset(
"ArmelR/the-pile-splitted",
subset_of_interest,
num_proc=8
)
```
Using `subset_of_interest = "default"` will load the whole dataset.
|
ibranze/araproje_hellaswag_tr_conf_mgpt_farestscore_all | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 87016
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_conf_mgpt_farestscore_all"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
osi1880vr/prompt_quill | ---
license: apache-2.0
---
This is the data you can use when running Prompt Quill.
You can find Prompt Quill here:
https://github.com/osi1880vr/prompt_quill/tree/main
The main idea of Prompt Quill is to make it easier to produce text-to-image prompts.
To do this, I created a RAG workflow that uses more than 1.5 million prompts from civitai.
In the Prompt Quill UI you can edit the magic prompt, with a history that allows you to take control over the
style in which the prompts are generated.
Play around and have fun
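The retrieval idea behind the workflow can be sketched as follows. This is a toy illustration only: the real project searches a vector store over roughly 1.5 million prompts, while here a simple word-overlap score and three invented prompts stand in for the embedding search.

```python
# Toy stand-in for the RAG step: rank stored prompts by similarity to a
# short user query and return the best matches as context for an LLM.
def overlap_score(query: str, prompt: str) -> float:
    """Fraction of query words that also appear in the stored prompt."""
    q, p = set(query.lower().split()), set(prompt.lower().split())
    return len(q & p) / max(len(q), 1)

# Invented example prompts; the real store holds ~1.5M civitai prompts.
stored_prompts = [
    "a watercolor portrait of a fox with soft lighting",
    "cyberpunk city street at night with neon rain",
    "studio portrait of an astronaut with dramatic lighting",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    ranked = sorted(stored_prompts,
                    key=lambda p: overlap_score(query, p),
                    reverse=True)
    return ranked[:k]

context = retrieve("portrait with dramatic lighting")
```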
|
kenhktsui/open-react-retrieval-multi-neg-result-new-kw | ---
dataset_info:
features:
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
- name: meta
struct:
- name: first_search_rank
dtype: int64
- name: second_search
dtype: bool
- name: second_search_success
dtype: bool
- name: source
dtype: string
splits:
- name: train
num_bytes: 83579841
num_examples: 25158
download_size: 21996450
dataset_size: 83579841
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "open-react-retrieval-multi-neg-result-new-kw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mapama247/wikihow_es | ---
pretty_name: WikiHow-ES
license: cc-by-nc-sa-3.0
size_categories: 1K<n<10K
language: es
multilinguality: monolingual
task_categories:
- text-classification
- question-answering
- conversational
- summarization
tags:
- Spanish
- WikiHow
- Wiki Articles
- Tutorials
- Step-By-Step
- Instruction Tuning
---
### Dataset Summary
Articles retrieved from the [Spanish WikiHow website](https://es.wikihow.com) in September 2023.
Each article contains a tutorial about a specific topic. The format is always a "How to" question
followed by a detailed step-by-step explanation. In some cases, the response includes several methods.
The main idea is to use this data for instruction tuning of Spanish LLMs, but given its nature it
could also be used for other tasks such as text classification or summarization.
### Languages
- Spanish (ES)
### Usage
To load the full dataset:
```python
from datasets import load_dataset
all_articles = load_dataset("mapama247/wikihow_es", trust_remote_code=True)
print(all_articles.num_rows) # output: {'train': 7380}
```
To load only examples from a specific category:
```python
from datasets import load_dataset
sports_articles = load_dataset("mapama247/wikihow_es", "deportes")
print(sports_articles.num_rows) # output: {'train': 201}
```
List of available categories, with the respective number of examples:
```
computadoras-y-electrónica 821
salud 804
pasatiempos 729
cuidado-y-estilo-personal 724
carreras-y-educación 564
en-la-casa-y-el-jardín 496
finanzas-y-negocios 459
comida-y-diversión 454
relaciones 388
mascotas-y-animales 338
filosofía-y-religión 264
arte-y-entretenimiento 254
en-el-trabajo 211
adolescentes 201
deportes 201
vida-familiar 147
viajes 139
automóviles-y-otros-vehículos 100
días-de-fiesta-y-tradiciones 86
```
### Supported Tasks
This dataset can be used to train a model for...
- `instruction-tuning`
- `text-classification`
- `question-answering`
- `conversational`
- `summarization`
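For the instruction-tuning use case, the card's fields map naturally onto (instruction, response) pairs. A minimal sketch, assuming the schema shown under "Data Instances" below; the sample record here is invented for illustration:

```python
# Build one (instruction, response) pair per alternative answer,
# using the `question` and `answers` fields from the dataset schema.
def to_instruction_pairs(record: dict) -> list[tuple[str, str]]:
    return [(record["question"], answer) for answer in record["answers"]]

# Invented sample record mirroring the card's schema.
sample = {
    "category": "deportes",
    "question": "¿Cómo aprender a nadar?",
    "answers": ["Método 1: ...", "Método 2: ..."],
}
pairs = to_instruction_pairs(sample)
```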
## Dataset Structure
### Data Instances
```python
{
'category': str,
'question': str,
'introduction': str,
'answers': List[str],
'short_answers': List[str],
'url': str,
'num_answers': int,
'num_refs': int,
'expert_author': bool,
}
```
### Data Fields
- `category`: The category (from [this list](https://es.wikihow.com/Especial:CategoryListing)) to which the example belongs to.
- `label`: Numerical representation of the category, for text classification purposes.
- `question`: The article's title, which always starts with "¿Cómo ...".
- `introduction`: Introductory text that precedes the step-by-step explanation.
- `answers`: List of complete answers, with the full explanation of each step.
- `short_answers`: List of shorter answers that only contain one-sentence steps.
- `num_answers`: The number of alternative answers provided (e.g. length of `answers`).
- `num_refs`: Number of references provided in the article.
- `expert_author`: Whether the article's author claims to be an expert on the topic or not.
- `url`: The URL address of the original article.
### Data Splits
There is only one split (`train`) that contains a total of 7,380 examples.
## Dataset Creation
### Curation Rationale
This dataset was created to align language models to end tasks and user preferences.
### Source Data
How-To questions with detailed step-by-step answers, retrieved from the WikiHow website.
#### Data Collection and Normalization
All articles available in September 2023 were extracted, no filters applied.
Along with the article's content, some metadata was retrieved as well.
#### Source language producers
WikiHow users. All the content is human-generated.
### Personal and Sensitive Information
The data does not include personal or sensitive information.
## Considerations
### Social Impact
The Spanish community can benefit from the high-quality data provided by this dataset.
### Bias
No post-processing steps have been applied to mitigate potential social biases.
## Additional Information
### Curators
Marc Pàmes @ Barcelona Supercomputing Center.
### License
This dataset is licensed under a **Creative Commons CC BY-NC-SA 3.0** license.
Quote from [WikiHow's Terms of Use](https://www.wikihow.com/wikiHow:Terms-of-Use):
> All text posted by Users to the Service is sub-licensed by wikiHow to other Users under a Creative Commons license as
> provided herein. The Creative Commons license allows such user generated text content to be used freely for personal,
> non-commercial purposes, so long as it is used and attributed to the original author as specified under the terms of
> the license. Allowing free republication of our articles helps wikiHow achieve its mission by providing instruction
> on solving the problems of everyday life to more people for free. In order to support this goal, wikiHow hereby grants
> each User of the Service a license to all text content that Users contribute to the Service under the terms and
> conditions of a Creative Commons CC BY-NC-SA 3.0 License. Please be sure to read the terms of the license carefully.
> You continue to own all right, title, and interest in and to your User Content, and you are free to distribute it as
> you wish, whether for commercial or non-commercial purposes.
|
SuperNova672/cord-10k-processed | ---
dataset_info:
features:
- name: data
dtype: string
splits:
- name: train
num_bytes: 524148223
num_examples: 695729
download_size: 275228391
dataset_size: 524148223
---
# Dataset Card for "cord-10k-processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BobaZooba/tmp_dataset | ---
dataset_info:
features:
- name: hello
dtype: string
splits:
- name: train
num_bytes: 19
num_examples: 2
download_size: 780
dataset_size: 19
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "tmp_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-80K | ---
pretty_name: Evaluation run of marcchew/Marcoroni-7B-LaMini-80K
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [marcchew/Marcoroni-7B-LaMini-80K](https://huggingface.co/marcchew/Marcoroni-7B-LaMini-80K)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-80K\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-03T19:17:19.895055](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-80K/blob/main/results_2023-12-03T19-17-19.895055.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"\
acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \
\ \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/marcchew/Marcoroni-7B-LaMini-80K
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|arc:challenge|25_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T08_02_52.884764
path:
- '**/details_harness|drop|3_2023-10-24T08-02-52.884764.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T08-02-52.884764.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T08_02_52.884764
path:
- '**/details_harness|gsm8k|5_2023-10-24T08-02-52.884764.parquet'
- split: 2023_12_03T19_17_19.895055
path:
- '**/details_harness|gsm8k|5_2023-12-03T19-17-19.895055.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-03T19-17-19.895055.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hellaswag|10_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T08_02_52.884764
path:
- '**/details_harness|winogrande|5_2023-10-24T08-02-52.884764.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T08-02-52.884764.parquet'
- config_name: results
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- results_2023-09-21T20-12-12.451376.parquet
- split: 2023_10_24T08_02_52.884764
path:
- results_2023-10-24T08-02-52.884764.parquet
- split: 2023_12_03T19_17_19.895055
path:
- results_2023-12-03T19-17-19.895055.parquet
- split: latest
path:
- results_2023-12-03T19-17-19.895055.parquet
---
# Dataset Card for Evaluation run of marcchew/Marcoroni-7B-LaMini-80K
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/marcchew/Marcoroni-7B-LaMini-80K
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [marcchew/Marcoroni-7B-LaMini-80K](https://huggingface.co/marcchew/Marcoroni-7B-LaMini-80K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-80K",
"harness_gsm8k_5",
split="train")
```
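The per-task configuration names listed in the YAML header above follow a mechanical pattern: the harness task id (e.g. `harness|hendrycksTest-anatomy|5`) with its separators flattened to underscores. As an illustration inferred from this card's config list (not an official API), a small helper can build the config name to pass to `load_dataset`:

```python
def harness_config_name(task: str, n_shot: int) -> str:
    """Build a config name like 'harness_hendrycksTest_anatomy_5'.

    Mirrors the naming seen in this card's YAML header: '-' and ':'
    in the task id are replaced with '_'.
    """
    cleaned = task.replace("-", "_").replace(":", "_")
    return f"harness_{cleaned}_{n_shot}"


# harness_config_name("hendrycksTest-anatomy", 5) -> "harness_hendrycksTest_anatomy_5"
# harness_config_name("truthfulqa:mc", 0)         -> "harness_truthfulqa_mc_0"
```

For example, `harness_config_name("gsm8k", 5)` yields `"harness_gsm8k_5"`, the config used in the loading snippet above.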
## Latest results
These are the [latest results from run 2023-12-03T19:17:19.895055](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-80K/blob/main/results_2023-12-03T19-17-19.895055.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
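To consume these numbers programmatically, the nested dict can be flattened into (task, metric) pairs; a minimal sketch using the literal shown above (for this run all values happen to be 0.0):

```python
results = {
    "all": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
}

# Flatten per-task metrics, skipping the "all" aggregate entry.
per_task = {
    (task, metric): value
    for task, metrics in results.items()
    if task != "all"
    for metric, value in metrics.items()
}
# per_task[("harness|gsm8k|5", "acc")] -> 0.0
```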
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ovior/twitter_dataset_1712989285 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2521078
num_examples: 7861
download_size: 1405794
dataset_size: 2521078
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-preference-64-nsample-2_iso_filter_gold_thr_0.1_self_70m | ---
dataset_info:
config_name: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43220385
num_examples: 18929
- name: epoch_1
num_bytes: 43903508
num_examples: 18929
- name: epoch_2
num_bytes: 43953581
num_examples: 18929
- name: epoch_3
num_bytes: 43980020
num_examples: 18929
- name: epoch_4
num_bytes: 43991276
num_examples: 18929
- name: epoch_5
num_bytes: 44002974
num_examples: 18929
- name: epoch_6
num_bytes: 44006203
num_examples: 18929
- name: epoch_7
num_bytes: 44011361
num_examples: 18929
- name: epoch_8
num_bytes: 43989099
num_examples: 18929
- name: epoch_9
num_bytes: 43992463
num_examples: 18929
download_size: 632651564
dataset_size: 439050870
configs:
- config_name: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
---
|
open-llm-leaderboard/details_ZoidBB__MultiKory-0.1-4x11b-pre1 | ---
pretty_name: Evaluation run of ZoidBB/MultiKory-0.1-4x11b-pre1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ZoidBB/MultiKory-0.1-4x11b-pre1](https://huggingface.co/ZoidBB/MultiKory-0.1-4x11b-pre1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ZoidBB__MultiKory-0.1-4x11b-pre1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-05T08:07:31.524035](https://huggingface.co/datasets/open-llm-leaderboard/details_ZoidBB__MultiKory-0.1-4x11b-pre1/blob/main/results_2024-02-05T08-07-31.524035.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6506701138513552,\n\
\ \"acc_stderr\": 0.032251654043750994,\n \"acc_norm\": 0.6513642747647624,\n\
\ \"acc_norm_stderr\": 0.032921546302611786,\n \"mc1\": 0.5214198286413708,\n\
\ \"mc1_stderr\": 0.01748743214471164,\n \"mc2\": 0.6767314029001894,\n\
\ \"mc2_stderr\": 0.01524427540483159\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7090443686006825,\n \"acc_stderr\": 0.013273077865907588,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545803\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6954789882493527,\n\
\ \"acc_stderr\": 0.004592637369905791,\n \"acc_norm\": 0.879008165704043,\n\
\ \"acc_norm_stderr\": 0.0032545129328064\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138208,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138208\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.02956070739246572,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.02956070739246572\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265023,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265023\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44692737430167595,\n\
\ \"acc_stderr\": 0.016628030039647614,\n \"acc_norm\": 0.44692737430167595,\n\
\ \"acc_norm_stderr\": 0.016628030039647614\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n\
\ \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n\
\ \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700033,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700033\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304324,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304324\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5214198286413708,\n\
\ \"mc1_stderr\": 0.01748743214471164,\n \"mc2\": 0.6767314029001894,\n\
\ \"mc2_stderr\": 0.01524427540483159\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8539857932123125,\n \"acc_stderr\": 0.009924440374585243\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6095526914329037,\n \
\ \"acc_stderr\": 0.01343782986466858\n }\n}\n```"
repo_url: https://huggingface.co/ZoidBB/MultiKory-0.1-4x11b-pre1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|arc:challenge|25_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|gsm8k|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hellaswag|10_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T08-07-31.524035.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-05T08-07-31.524035.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- '**/details_harness|winogrande|5_2024-02-05T08-07-31.524035.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-05T08-07-31.524035.parquet'
- config_name: results
data_files:
- split: 2024_02_05T08_07_31.524035
path:
- results_2024-02-05T08-07-31.524035.parquet
- split: latest
path:
- results_2024-02-05T08-07-31.524035.parquet
---
# Dataset Card for Evaluation run of ZoidBB/MultiKory-0.1-4x11b-pre1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ZoidBB/MultiKory-0.1-4x11b-pre1](https://huggingface.co/ZoidBB/MultiKory-0.1-4x11b-pre1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ZoidBB__MultiKory-0.1-4x11b-pre1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-05T08:07:31.524035](https://huggingface.co/datasets/open-llm-leaderboard/details_ZoidBB__MultiKory-0.1-4x11b-pre1/blob/main/results_2024-02-05T08-07-31.524035.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6506701138513552,
"acc_stderr": 0.032251654043750994,
"acc_norm": 0.6513642747647624,
"acc_norm_stderr": 0.032921546302611786,
"mc1": 0.5214198286413708,
"mc1_stderr": 0.01748743214471164,
"mc2": 0.6767314029001894,
"mc2_stderr": 0.01524427540483159
},
"harness|arc:challenge|25": {
"acc": 0.7090443686006825,
"acc_stderr": 0.013273077865907588,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545803
},
"harness|hellaswag|10": {
"acc": 0.6954789882493527,
"acc_stderr": 0.004592637369905791,
"acc_norm": 0.879008165704043,
"acc_norm_stderr": 0.0032545129328064
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138208,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.02956070739246572,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.02956070739246572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092434,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265023,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265023
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44692737430167595,
"acc_stderr": 0.016628030039647614,
"acc_norm": 0.44692737430167595,
"acc_norm_stderr": 0.016628030039647614
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214961,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700033,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700033
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304324,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304324
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5214198286413708,
"mc1_stderr": 0.01748743214471164,
"mc2": 0.6767314029001894,
"mc2_stderr": 0.01524427540483159
},
"harness|winogrande|5": {
"acc": 0.8539857932123125,
"acc_stderr": 0.009924440374585243
},
"harness|gsm8k|5": {
"acc": 0.6095526914329037,
"acc_stderr": 0.01343782986466858
}
}
```
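For quick offline analysis, a results JSON in the format above can be parsed with the standard library alone. The sketch below uses a truncated excerpt containing three of the per-task entries shown above (the full file has one entry per task) and averages the accuracy over the MMLU (`hendrycksTest`) subtasks:

```python
import json

# Truncated excerpt of a results JSON in the format shown above; the full file
# downloaded from the repo contains one entry per evaluated task.
results_json = """
{
  "harness|hendrycksTest-anatomy|5": {"acc": 0.6518518518518519},
  "harness|hendrycksTest-astronomy|5": {"acc": 0.6842105263157895},
  "harness|hendrycksTest-virology|5": {"acc": 0.5542168674698795}
}
"""

results = json.loads(results_json)

# Average accuracy over the MMLU (hendrycksTest) subtasks present in the excerpt.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU subtasks: {len(mmlu_accs)}, mean acc: {mmlu_mean:.4f}")
```

The same pattern extends to `acc_norm`, `mc1`/`mc2`, or any other metric key present in the per-task dicts.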
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Kukedlc__Neural-Krishna-Multiverse-7b-v3 | ---
pretty_name: Evaluation run of Kukedlc/Neural-Krishna-Multiverse-7b-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/Neural-Krishna-Multiverse-7b-v3](https://huggingface.co/Kukedlc/Neural-Krishna-Multiverse-7b-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__Neural-Krishna-Multiverse-7b-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-14T13:55:26.774118](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__Neural-Krishna-Multiverse-7b-v3/blob/main/results_2024-03-14T13-55-26.774118.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.651948480086034,\n\
\ \"acc_stderr\": 0.03204581841636779,\n \"acc_norm\": 0.651065777780926,\n\
\ \"acc_norm_stderr\": 0.03271991116015648,\n \"mc1\": 0.6230110159118727,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.773939891269351,\n\
\ \"mc2_stderr\": 0.013794028338835833\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.013284525292403513,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545797\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.714299940250946,\n\
\ \"acc_stderr\": 0.004508239594503832,\n \"acc_norm\": 0.8906592312288388,\n\
\ \"acc_norm_stderr\": 0.0031142850772280313\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642514,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43575418994413406,\n\
\ \"acc_stderr\": 0.016583881958602394,\n \"acc_norm\": 0.43575418994413406,\n\
\ \"acc_norm_stderr\": 0.016583881958602394\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4791395045632334,\n\
\ \"acc_stderr\": 0.012759117066518015,\n \"acc_norm\": 0.4791395045632334,\n\
\ \"acc_norm_stderr\": 0.012759117066518015\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6230110159118727,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.773939891269351,\n\
\ \"mc2_stderr\": 0.013794028338835833\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7035633055344959,\n \
\ \"acc_stderr\": 0.012579398235589526\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/Neural-Krishna-Multiverse-7b-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|arc:challenge|25_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|gsm8k|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hellaswag|10_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T13-55-26.774118.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T13-55-26.774118.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- '**/details_harness|winogrande|5_2024-03-14T13-55-26.774118.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-14T13-55-26.774118.parquet'
- config_name: results
data_files:
- split: 2024_03_14T13_55_26.774118
path:
- results_2024-03-14T13-55-26.774118.parquet
- split: latest
path:
- results_2024-03-14T13-55-26.774118.parquet
---
# Dataset Card for Evaluation run of Kukedlc/Neural-Krishna-Multiverse-7b-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/Neural-Krishna-Multiverse-7b-v3](https://huggingface.co/Kukedlc/Neural-Krishna-Multiverse-7b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__Neural-Krishna-Multiverse-7b-v3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-14T13:55:26.774118](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__Neural-Krishna-Multiverse-7b-v3/blob/main/results_2024-03-14T13-55-26.774118.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.651948480086034,
"acc_stderr": 0.03204581841636779,
"acc_norm": 0.651065777780926,
"acc_norm_stderr": 0.03271991116015648,
"mc1": 0.6230110159118727,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.773939891269351,
"mc2_stderr": 0.013794028338835833
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.013284525292403513,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545797
},
"harness|hellaswag|10": {
"acc": 0.714299940250946,
"acc_stderr": 0.004508239594503832,
"acc_norm": 0.8906592312288388,
"acc_norm_stderr": 0.0031142850772280313
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642514,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43575418994413406,
"acc_stderr": 0.016583881958602394,
"acc_norm": 0.43575418994413406,
"acc_norm_stderr": 0.016583881958602394
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4791395045632334,
"acc_stderr": 0.012759117066518015,
"acc_norm": 0.4791395045632334,
"acc_norm_stderr": 0.012759117066518015
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6230110159118727,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.773939891269351,
"mc2_stderr": 0.013794028338835833
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479674
},
"harness|gsm8k|5": {
"acc": 0.7035633055344959,
"acc_stderr": 0.012579398235589526
}
}
```
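As a quick illustration, a results dict shaped like the JSON above can be post-processed directly, for example to average `acc_norm` over the MMLU (`hendrycksTest`) tasks. The small `results` excerpt below is a hand-picked sketch using values from this run, not the full results file:

```python
# Sketch: average acc_norm over MMLU ("hendrycksTest") tasks from a results
# dict shaped like the JSON above. This excerpt is illustrative, not the full run.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.7286689419795221},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.7039473684210527},
}

# Keep only the MMLU tasks, then take the unweighted mean of their acc_norm.
mmlu = [v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu) / len(mmlu)
print(f"MMLU acc_norm over {len(mmlu)} tasks: {mmlu_avg:.4f}")
```

The same pattern works on the full `results_*.json` file after loading it with `json.load`.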
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_AA051610__T1C | ---
pretty_name: Evaluation run of AA051610/T1C
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051610/T1C](https://huggingface.co/AA051610/T1C) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__T1C\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T15:21:34.954726](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__T1C/blob/main/results_2023-10-11T15-21-34.954726.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5614045523007456,\n\
\ \"acc_stderr\": 0.034472805150990236,\n \"acc_norm\": 0.5650409022375938,\n\
\ \"acc_norm_stderr\": 0.03446466967324352,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.42517178573631115,\n\
\ \"mc2_stderr\": 0.01461529390566251\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4709897610921502,\n \"acc_stderr\": 0.014586776355294316,\n\
\ \"acc_norm\": 0.5017064846416383,\n \"acc_norm_stderr\": 0.01461130570505699\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5382393945429197,\n\
\ \"acc_stderr\": 0.004975167382061832,\n \"acc_norm\": 0.7220673172674766,\n\
\ \"acc_norm_stderr\": 0.004470644845242893\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.043192236258113324,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.043192236258113324\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981748,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981748\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.02951470358398177,\n\
\ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.02951470358398177\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n\
\ \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n\
\ \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.502127659574468,\n \"acc_stderr\": 0.032685726586674915,\n \"\
acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.032685726586674915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819067,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819067\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.026593084516572284,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.026593084516572284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533084,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507382,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507382\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7541284403669725,\n \"acc_stderr\": 0.018461940968708443,\n \"\
acc_norm\": 0.7541284403669725,\n \"acc_norm_stderr\": 0.018461940968708443\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6029411764705882,\n \"acc_stderr\": 0.03434131164719129,\n \"\
acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.03434131164719129\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.04453197507374984,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.04453197507374984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.02466249684520982,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.02466249684520982\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7624521072796935,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.7624521072796935,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.02629622791561367,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.02629622791561367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\
\ \"acc_stderr\": 0.01536686038639711,\n \"acc_norm\": 0.3027932960893855,\n\
\ \"acc_norm_stderr\": 0.01536686038639711\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.02758281141515961,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.02758281141515961\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.02698147804364804,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.02698147804364804\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.02935491115994098,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.02935491115994098\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n\
\ \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.4517601043024772,\n\
\ \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5718954248366013,\n \"acc_stderr\": 0.0200176292142131,\n \
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.0200176292142131\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087548,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087548\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.42517178573631115,\n\
\ \"mc2_stderr\": 0.01461529390566251\n }\n}\n```"
repo_url: https://huggingface.co/AA051610/T1C
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|arc:challenge|25_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hellaswag|10_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T15-21-34.954726.parquet'
- config_name: results
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- results_2023-10-11T15-21-34.954726.parquet
- split: latest
path:
- results_2023-10-11T15-21-34.954726.parquet
---
# Dataset Card for Evaluation run of AA051610/T1C
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AA051610/T1C
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AA051610/T1C](https://huggingface.co/AA051610/T1C) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__T1C",
"harness_truthfulqa_mc_0",
split="train")
```
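As a side note, the timestamped split names in the configs above are just the run timestamp with characters that are awkward in split names replaced by underscores. A minimal sketch of that mapping (the helper name is illustrative, not part of any library):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name by replacing '-' and ':' with '_'."""
    return ts.replace("-", "_").replace(":", "_")

# The run shown in this card:
print(timestamp_to_split("2023-10-11T15:21:34.954726"))  # 2023_10_11T15_21_34.954726
```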
## Latest results
These are the [latest results from run 2023-10-11T15:21:34.954726](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__T1C/blob/main/results_2023-10-11T15-21-34.954726.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5614045523007456,
"acc_stderr": 0.034472805150990236,
"acc_norm": 0.5650409022375938,
"acc_norm_stderr": 0.03446466967324352,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.42517178573631115,
"mc2_stderr": 0.01461529390566251
},
"harness|arc:challenge|25": {
"acc": 0.4709897610921502,
"acc_stderr": 0.014586776355294316,
"acc_norm": 0.5017064846416383,
"acc_norm_stderr": 0.01461130570505699
},
"harness|hellaswag|10": {
"acc": 0.5382393945429197,
"acc_stderr": 0.004975167382061832,
"acc_norm": 0.7220673172674766,
"acc_norm_stderr": 0.004470644845242893
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.043192236258113324,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.043192236258113324
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981748,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981748
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.02951470358398177,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.02951470358398177
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819067,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572284,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572284
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533084,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507382,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507382
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7541284403669725,
"acc_stderr": 0.018461940968708443,
"acc_norm": 0.7541284403669725,
"acc_norm_stderr": 0.018461940968708443
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.03434131164719129,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.03434131164719129
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374984,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520982,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520982
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7624521072796935,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.7624521072796935,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.02629622791561367,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.02629622791561367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.01536686038639711,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.01536686038639711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.02758281141515961,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.02758281141515961
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364804,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364804
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.02935491115994098,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.02935491115994098
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4517601043024772,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.4517601043024772,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.0200176292142131,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.0200176292142131
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087548,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087548
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.42517178573631115,
"mc2_stderr": 0.01461529390566251
}
}
```
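For a quick sanity check, per-task accuracies like the ones above can be macro-averaged by hand. The sketch below averages a three-task subset copied from the results (the leaderboard's "all" figure averages every evaluated task, so its value differs):

```python
from statistics import mean

# Subset of per-task accuracies copied from the results above.
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.32,
    "harness|hendrycksTest-anatomy|5": 0.5037037037037037,
    "harness|hendrycksTest-astronomy|5": 0.5789473684210527,
}

macro_avg = mean(task_acc.values())
print(round(macro_avg, 4))  # 0.4676
```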
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Nexdata/French_Speaking_English_Speech_Data_by_Mobile_Phone | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/French_Speaking_English_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/989?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
1,089 native French speakers participated in the recording, with authentic accents. The recorded script was designed by linguists and covers a wide range of topics, including generic, interactive, on-board, and home scenarios. The text was manually proofread with high accuracy. The recordings match mainstream Android and Apple phones. The dataset can be applied to automatic speech recognition and machine translation scenarios.
For more details, please refer to the link: https://www.nexdata.ai/datasets/989?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
English (spoken by native French speakers)
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
graizelle/real_gurls_data | ---
license: cc-by-sa-4.0
tags:
- tgirl
- trans-female
pretty_name: Real Gurls dataset
size_categories:
- n<1K
---
Dataset used for Real Gurls models.
NSFW trans females. |
TMZN/baidubaike | ---
license: gpl-3.0
---
# If any content infringes your rights, please contact us for removal.
# Thanks to 阿弥陀佛, a well-known expert in the dictionary community — these are all his work. I am only uploading them here; they are not my own creations.
# Please use MdxExport.exe to decompile the files and export HTML, which makes it easier to hand-build a dataset afterwards. If that does not work, use https://github.com/liuyug/mdict-utils
Phoebechen123/cifar10c_snow | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 47962052.0
num_examples: 50000
download_size: 19565402
dataset_size: 47962052.0
---
# Dataset Card for "cifar10c_snow"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
khalidalt/HuffPost | ---
license: cc0-1.0
---
# Dataset Card for HuffPost
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.kaggle.com/datasets/rmisra/news-category-dataset/metadata
### Dataset Summary
A dataset of approximately 200K news headlines collected from HuffPost between 2012 and 2018.
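The upstream Kaggle dump is distributed as JSON Lines, one record per headline. A minimal parsing sketch (the field names below reflect the News Category Dataset release and are an assumption, not verified against this mirror):

```python
import json

# One record in the upstream JSON-Lines format (illustrative values).
line = '{"category": "POLITICS", "headline": "Example headline", "date": "2018-05-26"}'
record = json.loads(line)
print(record["category"], record["date"])  # POLITICS 2018-05-26
```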
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
license: cc0-1.0
### Citation Information
```
@book{book,
author = {Misra, Rishabh and Grover, Jigyasa},
year = {2021},
month = {01},
pages = {},
title = {Sculpting Data for ML: The first act of Machine Learning},
isbn = {978-0-578-83125-1}
}
@dataset{dataset,
author = {Misra, Rishabh},
year = {2018},
month = {06},
pages = {},
title = {News Category Dataset},
doi = {10.13140/RG.2.2.20331.18729}
}
```
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
|
316usman/thematic1d_rr | ---
dataset_info:
features:
- name: text
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
- name: num_tokens
dtype: int64
splits:
- name: train
num_bytes: 82397997.40905319
num_examples: 129936
download_size: 29755891
dataset_size: 82397997.40905319
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-from-one-sec-cv12/chunk_160 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1188920176
num_examples: 231668
download_size: 1214023103
dataset_size: 1188920176
---
# Dataset Card for "chunk_160"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mondaugen/IMBD | ---
license: other
license_name: imdb
license_link: https://developer.imdb.com/non-commercial-datasets/
---
|