datasetId | card |
|---|---|
Asimok/KGLQA-KeySentenceSelect-CCLUE-MRC | ---
configs:
- config_name: raw
data_files:
- split: train
path:
- "KGLQA-KeySentenceSelect-CCLUE-MRC-raw/train.jsonl"
- split: dev
path:
- "KGLQA-KeySentenceSelect-CCLUE-MRC-raw/dev.jsonl"
- split: test
path:
- "KGLQA-KeySentenceSelect-CCLUE-MRC-raw/test.jsonl"
- config_name: normal
data_files:
- split: train
path:
- "KGLQA-KeySentenceSelect-CCLUE-MRC/train.jsonl"
- split: dev
path:
- "KGLQA-KeySentenceSelect-CCLUE-MRC/dev.jsonl"
- split: test
path:
- "KGLQA-KeySentenceSelect-CCLUE-MRC/test.jsonl"
- config_name: instruct
data_files:
- split: train
path:
- "KGLQA-KeySentenceSelect-CCLUE-MRC-instruct/train.jsonl"
- split: dev
path:
- "KGLQA-KeySentenceSelect-CCLUE-MRC-instruct/dev.jsonl"
- split: test
path:
- "KGLQA-KeySentenceSelect-CCLUE-MRC-instruct/test.jsonl"
---
|
ChigozieAnyaejiP/Interview_questions | ---
license: mit
task_categories:
- question-answering
language:
- en
size_categories:
- n<1K
--- |
vicaloy/raft-theoric | ---
dataset_info:
features:
- name: completion
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 758117
num_examples: 397
download_size: 214341
dataset_size: 758117
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bourneyz/bonuy | ---
license: openrail
---
|
hjm0525/drawings | ---
license: other
license_name: connexverse
license_link: https://connexverse.com
---
|
CyberHarem/atlanta_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of atlanta/アトランタ/亚特兰大 (Azur Lane)
This is the dataset of atlanta/アトランタ/亚特兰大 (Azur Lane), containing 19 images and their tags.
The core tags of this character are `pink_hair, blue_eyes, braid, long_hair, ahoge, bangs, crown_braid, black_ribbon, hair_ribbon, ribbon, breasts, hair_ornament, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 16.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/atlanta_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 11.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/atlanta_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 37 | 20.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/atlanta_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 14.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/atlanta_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 37 | 24.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/atlanta_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/atlanta_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some character outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, looking_at_viewer, solo, fingerless_gloves, red_necktie, white_shirt, bare_shoulders, blue_skirt, pleated_skirt, white_thighhighs, blush, simple_background, single_thighhigh, white_background, detached_collar, miniskirt, detached_sleeves, off-shoulder_shirt, open_mouth, smile, red_gloves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | fingerless_gloves | red_necktie | white_shirt | bare_shoulders | blue_skirt | pleated_skirt | white_thighhighs | blush | simple_background | single_thighhigh | white_background | detached_collar | miniskirt | detached_sleeves | off-shoulder_shirt | open_mouth | smile | red_gloves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------------------|:--------------|:--------------|:-----------------|:-------------|:----------------|:-------------------|:--------|:--------------------|:-------------------|:-------------------|:------------------|:------------|:-------------------|:---------------------|:-------------|:--------|:-------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
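As a rough sketch (not part of the original tooling), the cluster tags above can be used to filter the extracted raw dataset down to a single outfit; the tag names below are taken from cluster #0, and `item.meta['tags']` is assumed to expose the tags as in the loading code earlier:
```python
from waifuc.source import LocalSource

# Hypothetical outfit filter: keep images whose tags include
# a few of the cluster #0 tags listed above
wanted = {'red_necktie', 'blue_skirt', 'white_thighhighs'}
source = LocalSource('dataset_dir')  # directory extracted in the loading example
outfit_items = [
    item for item in source
    if wanted.issubset(item.meta['tags'])
]
print(f'{len(outfit_items)} images match the outfit tags')
```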
|
77asadian/FAQ_ds | ---
license: mit
---
|
pkuHaowei/cub-200-2011-birds | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 366782244.5
num_examples: 11788
download_size: 365907536
dataset_size: 366782244.5
---
# Dataset Card for "cub-200-2011-birds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xusenlin/clue-ner | ---
dataset_info:
features:
- name: text
dtype: string
- name: entities
list:
- name: id
dtype: int64
- name: entity
dtype: string
- name: start_offset
dtype: int64
- name: end_offset
dtype: int64
- name: label
dtype: string
splits:
- name: train
num_bytes: 2443356
num_examples: 10748
- name: test
num_bytes: 154492
num_examples: 1345
- name: validation
num_bytes: 309106
num_examples: 1343
download_size: 1658426
dataset_size: 2906954
language:
- zh
tags:
- named entity recognition
- clue
license: apache-2.0
---
# CLUE-NER Named Entity Recognition Dataset
Field descriptions:
+ `text`: the text
+ `entities`: the entities contained in the text
  + `id`: entity `id`
  + `entity`: the entity's surface string
  + `start_offset`: start position of the entity
  + `end_offset`: one past the end position of the entity
  + `label`: the entity's label
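As a minimal loading sketch (assuming the standard `datasets` API and the field layout above), the offset convention can be checked by slicing the text:
```python
from datasets import load_dataset

# Load the train split and recover each entity string from its offsets
dataset = load_dataset("xusenlin/clue-ner", split="train")
example = dataset[0]
print(example["text"])
for ent in example["entities"]:
    # end_offset is exclusive, so the slice ends exactly at the entity boundary
    span = example["text"][ent["start_offset"]:ent["end_offset"]]
    print(ent["label"], span, span == ent["entity"])
```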
|
yjg30737/onepiece-characters | ---
license: mit
task_categories:
- table-question-answering
language:
- en
- ja
- ko
size_categories:
- 100K<n<1M
---
# onepiece-characters
This is a dataset created by crawling the One Piece Fandom wiki on 2023-07-08. |
center-for-humans-and-machines/style-diffusion | ---
dataset_info:
features:
- name: vectorId
dtype: string
- name: medianYear
dtype: int32
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 3448928
num_examples: 1113
download_size: 0
dataset_size: 3448928
---
# Dataset Card for "style-diffusion"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
myradeng/diffusion_db_dedup_from5k_val_v2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: seed
dtype: uint32
- name: step
dtype: uint16
- name: cfg
dtype: float32
- name: sampler
dtype: string
- name: width
dtype: uint16
- name: height
dtype: uint16
- name: user_name
dtype: string
- name: timestamp
dtype: timestamp[ns, tz=UTC]
- name: image_nsfw
dtype: float32
- name: prompt_nsfw
dtype: float32
- name: __index_level_0__
dtype: int64
- name: image_path
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 541341423.6482127
num_examples: 918
download_size: 541203333
dataset_size: 541341423.6482127
---
# Dataset Card for "diffusion_db_dedup_from5k_val_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Maaciek/test | ---
license: cc-by-nc-4.0
---
|
aliamdubsh/hand_drawn | ---
license: mit
---
|
AbhayaHanuma/mini-samsum | ---
dataset_info:
features:
- name: dialogue
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 1755840
num_examples: 1000
download_size: 1063943
dataset_size: 1755840
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
patrickvonplaten/restore_punctuation_medium_num_beams_2 | ---
tags:
- speechbox_punc
--- |
hotal/honeypot_logs | ---
dataset_info:
features:
- name: system
dtype: string
- name: command
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 2370026
num_examples: 5631
download_size: 227122
dataset_size: 2370026
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "honeypot_logs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atmallen/quirky_addition_increment3_alice_hard | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 1636460.2815
num_examples: 24225
- name: validation
num_bytes: 160122.8088
num_examples: 2372
- name: test
num_bytes: 164309.7354
num_examples: 2433
download_size: 623109
dataset_size: 1960892.8257
---
# Dataset Card for "quirky_addition_increment3_alice_hard"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AIBreeding/PhylogeneticProfiling | ---
license: apache-2.0
---
|
havens2/apitext | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 6472042
num_examples: 8830
download_size: 2694540
dataset_size: 6472042
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "apitext"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zuleo/aubrey-plaza | ---
license: creativeml-openrail-m
tags:
- stable-diffusion
- embedding
- textual-inversion
- text-to-image
- image-to-image
- art
- artistic
---
# Aubrey Plaza textual inversion
This is an embedding of the amazing Aubrey Plaza.
## Version 2:

## Version 1:

## Embedding Usage
Use the token ```aubreyplazav2-300```
### Previous versions:
| Token | Version |
|----------------------|------------------------|
| `aubreyplazav2-300` | Version 2 - 300 steps |
| `aubreyplazav1-7375` | Version 1 - 7375 steps |

---
## 🎶 Prompt Examples
🧾 ```Perfectly-centered close up portrait-photograph of a real life warrior aubreyplazav2-300, hair flowing in the wind with beautiful bright blue eyes, (wearing gold and white armor and big hoop gold earrings and a tiara:1.22223), (battle axe and broad sword hanging from her belt:1.112), standing near a rain forest with a waterfall, lifelike, super highly detailed, professional digital painting, artstation, concept art, Photorealism, HD quality, 8k resolution, beautiful, cinematic, art by artgerm and greg rutkowski and alphonse mucha and loish and WLOP```
⛔ Negative prompt: ```(bad_prompt_version2:0.8), ((((ugly)))), (((duplicate))), ((morbid)), ((mutilated)), [out of frame], extra fingers, mutated hands, ((poorly drawn hands)), ((poorly drawn face)), (((mutation))), (((deformed))), ((ugly)), blurry, ((bad anatomy)), (((bad proportions))), ((extra limbs)), cloned face, (((disfigured))), out of frame, ugly, extra limbs, (bad anatomy), gross proportions, (malformed limbs), ((missing arms)), ((missing legs)), (((extra arms))), (((extra legs))), mutated hands, (fused fingers), (too many fingers), (((long neck))), watermark, signature, words, (text:1.4), cross eyed```
_Steps: 20, Sampler: DPM++ 2S a Karras, CFG scale: 7, Seed: 3960559569, Size: 512x512, Model hash: 67abd65708_
---
🧾 ```photorealistic painting ((full body)) portrait of ((stunningly attractive)) a aubreyplazav2-300 at a bar, ((perfect feminine face)), (+long colorful wavy hair), (+glitter freckles), glitter, wearing a dress, intricate, 8k, highly detailed, volumetric lighting, digital painting, intense, sharp focus, art by artgerm and rutkowski and alphonse mucha, cgsociety```
⛔ Negative prompt: ```(bad_prompt_version2:0.7), ((((ugly)))), (((duplicate))), ((morbid)), ((mutilated)), [out of frame], ((poorly drawn eyes)), extra fingers, ((poorly drawn face)), (((mutation))), (((deformed))), ((ugly)), blurry, ((bad anatomy)), ((extra limbs)), cloned face, (((disfigured))), out of frame, extra limbs, (bad anatomy), gross proportions, (malformed limbs), ((missing arms)), ((missing legs)), (((extra arms))), (((extra legs))), (fused fingers), (too many fingers), (((long neck)))```
_Steps: 36, Sampler: DPM++ 2S a Karras, CFG scale: 7, Seed: 788010516, Size: 512x512, Model hash: 67abd65708_
---
🧾 ```Perfectly-centered close up portrait-photograph of a real life sexy aubreyplazav2-300, hair flowing in the wind with (beautiful bright green eyes:1.2), (wearing a purple shirt and big hoop silver earrings and a green tiara:1.22223), standing near a twisting stairwell, lifelike, subsurface scattering, super highly detailed, professional digital painting, artstation, concept art, Photorealism, HD quality, 8k resolution, beautiful, cinematic, art by artgerm and greg rutkowski and alphonse mucha and loish and WLOP```
⛔ Negative prompt: ```(bad_prompt_version2:0.8), ((((ugly)))), (((duplicate))), ((morbid)), ((mutilated)), [out of frame], extra fingers, mutated hands, ((poorly drawn hands)), ((poorly drawn face)), (((mutation))), (((deformed))), ((ugly)), blurry, ((bad anatomy)), (((bad proportions))), ((extra limbs)), cloned face, (((disfigured))), out of frame, ugly, extra limbs, (bad anatomy), gross proportions, (malformed limbs), ((missing arms)), ((missing legs)), (((extra arms))), (((extra legs))), mutated hands, (fused fingers), (too many fingers), (((long neck))), watermark, signature, words, (text:1.4), cross eyed```
_Steps: 24, Sampler: DPM++ 2S a, CFG scale: 7.5, Seed: 4119437875, Size: 512x768, Model hash: d8691b4d16_
---
## 🎴 text2img Sampler and Checkpoint grids:
It's always great to get a visual of how different samplers behave across different models with this embedding. See the examples below and tune them to your liking.
[Sampling Grid](https://huggingface.co/datasets/zuleo/aubrey-plaza/resolve/main/images/sampler_ckpt_grid.png)
---
☕ If you enjoy this model, buy me a coffee [](https://ko-fi.com/3eegames)
--- |
TracyMc/testdataset | ---
license: mit
---
### Task Introduction
Fin-Eval is an evaluation dataset built specifically for large language models in the financial domain, covering areas such as wealth management, insurance, and investment research. It spans 28 sub-tasks across five capability categories: cognition, generation, financial knowledge, financial logic, and safety & compliance. The tasks were designed with the characteristics of large models in mind, including In-Context Learning, tool calling, and CoT.
### Loading the Data
```python
from datasets import load_dataset
dataset = load_dataset("TracyMc/testdataset", name="test")
print(dataset["test"][0])
# {'id': 1, '大类': '认知', '任务': '金融意图理解', '问题': '近期美元汇率有没有大幅波动', '答案': '行情解读', '解释': None}
```
For more details on dataset usage and evaluation methods, see the [github page](https://github.com).
If you need the full dataset, please send an email to request authorization: linchenxiao.xlc@antgroup.com. |
pranjali97/ha-en_RL-grow2_I2_valid | ---
dataset_info:
features:
- name: src
dtype: string
- name: ref
dtype: string
- name: mt
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 1427995
num_examples: 3339
download_size: 378938
dataset_size: 1427995
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ha-en_RL-grow2_I2_valid"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/vanilla_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of vanilla/バニラ/香草 (Arknights)
This is the dataset of vanilla/バニラ/香草 (Arknights), containing 56 images and their tags.
The core tags of this character are `horns, pointy_ears, short_hair, red_eyes, blonde_hair, hair_ornament, hairclip, tail, dragon_tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 56 | 73.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vanilla_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 56 | 61.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vanilla_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 128 | 115.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vanilla_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/vanilla_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some character outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, black_thighhighs, long_sleeves, open_jacket, solo, looking_at_viewer, simple_background, white_background, green_jacket, dragon_girl, standing, black_dress, black_footwear, breasts, holding_weapon, black_skirt, closed_mouth, dragon_horns, full_body, id_card, shoes, holding_axe |
| 1 | 14 |  |  |  |  |  | 1girl, solo, green_jacket, open_jacket, upper_body, looking_at_viewer, black_shirt, infection_monitor_(arknights), long_sleeves, simple_background, open_mouth, smile, blush, choker, collarbone, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_thighhighs | long_sleeves | open_jacket | solo | looking_at_viewer | simple_background | white_background | green_jacket | dragon_girl | standing | black_dress | black_footwear | breasts | holding_weapon | black_skirt | closed_mouth | dragon_horns | full_body | id_card | shoes | holding_axe | upper_body | black_shirt | infection_monitor_(arknights) | open_mouth | smile | blush | choker | collarbone |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:---------------|:--------------|:-------|:--------------------|:--------------------|:-------------------|:---------------|:--------------|:-----------|:--------------|:-----------------|:----------|:-----------------|:--------------|:---------------|:---------------|:------------|:----------|:--------|:--------------|:-------------|:--------------|:--------------------------------|:-------------|:--------|:--------|:---------|:-------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
distilled-from-one-sec-cv12/chunk_58 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1311169548
num_examples: 255489
download_size: 1334380090
dataset_size: 1311169548
---
# Dataset Card for "chunk_58"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pdebench/Burgers | ---
license: cc-by-4.0
---
|
sethapun/arithmetic_2all_1to5 | ---
dataset_info:
features:
- name: expression
dtype: string
- name: answer
dtype: float64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 54000
num_examples: 2000
- name: validation
num_bytes: 10800
num_examples: 400
download_size: 10946
dataset_size: 64800
---
# Dataset Card for "arithmetic_2all_1to5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713183106 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 7047
num_examples: 18
download_size: 10498
dataset_size: 7047
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713183106"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test-mathemakitt-c9cce3-2280272258 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-1b1
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test
dataset_config: mathemakitten--winobias_antistereotype_test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-1b1
* Dataset: mathemakitten/winobias_antistereotype_test
* Config: mathemakitten--winobias_antistereotype_test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
open-llm-leaderboard/details_ehartford__Samantha-1.11-7b | ---
pretty_name: Evaluation run of ehartford/Samantha-1.11-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/Samantha-1.11-7b](https://huggingface.co/ehartford/Samantha-1.11-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__Samantha-1.11-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T04:25:39.481995](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-7b/blob/main/results_2023-10-18T04-25-39.481995.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.00036305608931188775,\n \"f1\": 0.060975251677852296,\n\
\ \"f1_stderr\": 0.0013628501994356545,\n \"acc\": 0.40696714224080927,\n\
\ \"acc_stderr\": 0.00970971340875476\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931188775,\n\
\ \"f1\": 0.060975251677852296,\n \"f1_stderr\": 0.0013628501994356545\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07202426080363912,\n \
\ \"acc_stderr\": 0.007121147983537128\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.01229827883397239\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/Samantha-1.11-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|arc:challenge|25_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T04_25_39.481995
path:
- '**/details_harness|drop|3_2023-10-18T04-25-39.481995.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T04-25-39.481995.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T04_25_39.481995
path:
- '**/details_harness|gsm8k|5_2023-10-18T04-25-39.481995.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T04-25-39.481995.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hellaswag|10_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T04_25_39.481995
path:
- '**/details_harness|winogrande|5_2023-10-18T04-25-39.481995.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T04-25-39.481995.parquet'
- config_name: results
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- results_2023-08-25T14:45:21.657251.parquet
- split: 2023_10_18T04_25_39.481995
path:
- results_2023-10-18T04-25-39.481995.parquet
- split: latest
path:
- results_2023-10-18T04-25-39.481995.parquet
---
# Dataset Card for Evaluation run of ehartford/Samantha-1.11-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/Samantha-1.11-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/Samantha-1.11-7b](https://huggingface.co/ehartford/Samantha-1.11-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__Samantha-1.11-7b",
"harness_winogrande_5",
split="train")
```
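Since this repository contains two runs, each configuration also exposes the timestamped splits declared in the YAML header above. A minimal sketch (using the split names listed under the "results" configuration) of loading the aggregated metrics alongside a specific earlier run:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_ehartford__Samantha-1.11-7b"

# Aggregated metrics for the most recent run, via the "results"
# configuration and its "latest" split.
latest_results = load_dataset(repo, "results", split="latest")

# The earlier run stays addressable through its timestamped split name,
# exactly as declared in the YAML header above.
first_run = load_dataset(repo, "results", split="2023_08_25T14_45_21.657251")
```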
## Latest results
These are the [latest results from run 2023-10-18T04:25:39.481995](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-7b/blob/main/results_2023-10-18T04-25-39.481995.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931188775,
"f1": 0.060975251677852296,
"f1_stderr": 0.0013628501994356545,
"acc": 0.40696714224080927,
"acc_stderr": 0.00970971340875476
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931188775,
"f1": 0.060975251677852296,
"f1_stderr": 0.0013628501994356545
},
"harness|gsm8k|5": {
"acc": 0.07202426080363912,
"acc_stderr": 0.007121147983537128
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.01229827883397239
}
}
```
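To inspect the raw results file linked above without going through `datasets`, you can download it directly. A minimal sketch, assuming the aggregated metrics appear under the same keys as the excerpt above:
```python
import json

from huggingface_hub import hf_hub_download

# Download the results JSON referenced above; repo_type="dataset" is
# required because this repository is a dataset, not a model.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ehartford__Samantha-1.11-7b",
    filename="results_2023-10-18T04-25-39.481995.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Depending on the harness version, the metrics may sit at the top level
# (as in the excerpt above) or under a "results" key.
metrics = data.get("results", data)
print(metrics["all"])
```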
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Abhaykoul__Qwen1.5-0.5B-vortex-v2 | ---
pretty_name: Evaluation run of Abhaykoul/Qwen1.5-0.5B-vortex-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Abhaykoul/Qwen1.5-0.5B-vortex-v2](https://huggingface.co/Abhaykoul/Qwen1.5-0.5B-vortex-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Abhaykoul__Qwen1.5-0.5B-vortex-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T15:50:12.323046](https://huggingface.co/datasets/open-llm-leaderboard/details_Abhaykoul__Qwen1.5-0.5B-vortex-v2/blob/main/results_2024-03-11T15-50-12.323046.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3597331970616422,\n\
\ \"acc_stderr\": 0.03388517744885889,\n \"acc_norm\": 0.3635416873826628,\n\
\ \"acc_norm_stderr\": 0.03469794845518048,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766375,\n \"mc2\": 0.4429283527301926,\n\
\ \"mc2_stderr\": 0.014852198844725878\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2721843003412969,\n \"acc_stderr\": 0.013006600406423704,\n\
\ \"acc_norm\": 0.30631399317406144,\n \"acc_norm_stderr\": 0.013470584417276511\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3647679745070703,\n\
\ \"acc_stderr\": 0.0048038126319949696,\n \"acc_norm\": 0.4553873730332603,\n\
\ \"acc_norm_stderr\": 0.0049698795328430925\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740234,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740234\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.35094339622641507,\n \"acc_stderr\": 0.029373646253234686,\n\
\ \"acc_norm\": 0.35094339622641507,\n \"acc_norm_stderr\": 0.029373646253234686\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3819444444444444,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.3819444444444444,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.3468208092485549,\n \"acc_stderr\": 0.036291466701596636,\n\
\ \"acc_norm\": 0.3468208092485549,\n \"acc_norm_stderr\": 0.036291466701596636\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n\
\ \"acc_stderr\": 0.047240073523838876,\n \"acc_norm\": 0.3431372549019608,\n\
\ \"acc_norm_stderr\": 0.047240073523838876\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n\
\ \"acc_stderr\": 0.028809989854102967,\n \"acc_norm\": 0.26382978723404255,\n\
\ \"acc_norm_stderr\": 0.028809989854102967\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.041857744240220554,\n\
\ \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.041857744240220554\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\"\
: 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2804232804232804,\n\
\ \"acc_stderr\": 0.02313528797432563,\n \"acc_norm\": 0.2804232804232804,\n\
\ \"acc_norm_stderr\": 0.02313528797432563\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.038932596106046755,\n\
\ \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.038932596106046755\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.3935483870967742,\n \"acc_stderr\": 0.027791878753132274,\n\
\ \"acc_norm\": 0.3935483870967742,\n \"acc_norm_stderr\": 0.027791878753132274\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n \"\
acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.0390369864774844,\n\
\ \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.0390369864774844\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4696969696969697,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.4696969696969697,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.43523316062176165,\n \"acc_stderr\": 0.03578038165008585,\n\
\ \"acc_norm\": 0.43523316062176165,\n \"acc_norm_stderr\": 0.03578038165008585\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36923076923076925,\n \"acc_stderr\": 0.024468615241478923,\n\
\ \"acc_norm\": 0.36923076923076925,\n \"acc_norm_stderr\": 0.024468615241478923\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36554621848739494,\n \"acc_stderr\": 0.03128217706368461,\n\
\ \"acc_norm\": 0.36554621848739494,\n \"acc_norm_stderr\": 0.03128217706368461\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4073394495412844,\n \"acc_stderr\": 0.021065986244412877,\n \"\
acc_norm\": 0.4073394495412844,\n \"acc_norm_stderr\": 0.021065986244412877\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4362745098039216,\n \"acc_stderr\": 0.03480693138457038,\n \"\
acc_norm\": 0.4362745098039216,\n \"acc_norm_stderr\": 0.03480693138457038\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5232067510548524,\n \"acc_stderr\": 0.03251215201141018,\n \
\ \"acc_norm\": 0.5232067510548524,\n \"acc_norm_stderr\": 0.03251215201141018\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.24663677130044842,\n\
\ \"acc_stderr\": 0.028930413120910874,\n \"acc_norm\": 0.24663677130044842,\n\
\ \"acc_norm_stderr\": 0.028930413120910874\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5289256198347108,\n \"acc_stderr\": 0.04556710331269498,\n \"\
acc_norm\": 0.5289256198347108,\n \"acc_norm_stderr\": 0.04556710331269498\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.049505043821289195,\n\
\ \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.049505043821289195\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4658119658119658,\n\
\ \"acc_stderr\": 0.03267942734081228,\n \"acc_norm\": 0.4658119658119658,\n\
\ \"acc_norm_stderr\": 0.03267942734081228\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.334610472541507,\n\
\ \"acc_stderr\": 0.016873468641592157,\n \"acc_norm\": 0.334610472541507,\n\
\ \"acc_norm_stderr\": 0.016873468641592157\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.42485549132947975,\n \"acc_stderr\": 0.026613350840261736,\n\
\ \"acc_norm\": 0.42485549132947975,\n \"acc_norm_stderr\": 0.026613350840261736\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n\
\ \"acc_stderr\": 0.014572650383409153,\n \"acc_norm\": 0.2547486033519553,\n\
\ \"acc_norm_stderr\": 0.014572650383409153\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.42810457516339867,\n \"acc_stderr\": 0.028332397483664274,\n\
\ \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.028332397483664274\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3858520900321543,\n\
\ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.3858520900321543,\n\
\ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.35802469135802467,\n \"acc_stderr\": 0.026675611926037086,\n\
\ \"acc_norm\": 0.35802469135802467,\n \"acc_norm_stderr\": 0.026675611926037086\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880585,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880585\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30378096479791394,\n\
\ \"acc_stderr\": 0.011745787720472472,\n \"acc_norm\": 0.30378096479791394,\n\
\ \"acc_norm_stderr\": 0.011745787720472472\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3713235294117647,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.3713235294117647,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.30392156862745096,\n \"acc_stderr\": 0.018607552131279834,\n \
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.018607552131279834\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4090909090909091,\n\
\ \"acc_stderr\": 0.047093069786618966,\n \"acc_norm\": 0.4090909090909091,\n\
\ \"acc_norm_stderr\": 0.047093069786618966\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.028920583220675602,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.028920583220675602\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5074626865671642,\n\
\ \"acc_stderr\": 0.03535140084276719,\n \"acc_norm\": 0.5074626865671642,\n\
\ \"acc_norm_stderr\": 0.03535140084276719\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3433734939759036,\n\
\ \"acc_stderr\": 0.03696584317010601,\n \"acc_norm\": 0.3433734939759036,\n\
\ \"acc_norm_stderr\": 0.03696584317010601\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766375,\n \"mc2\": 0.4429283527301926,\n\
\ \"mc2_stderr\": 0.014852198844725878\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5603788476716653,\n \"acc_stderr\": 0.013949649776015684\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05913570887035633,\n \
\ \"acc_stderr\": 0.006497266660428841\n }\n}\n```"
repo_url: https://huggingface.co/Abhaykoul/Qwen1.5-0.5B-vortex-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|arc:challenge|25_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|gsm8k|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hellaswag|10_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T15-50-12.323046.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T15-50-12.323046.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- '**/details_harness|winogrande|5_2024-03-11T15-50-12.323046.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T15-50-12.323046.parquet'
- config_name: results
data_files:
- split: 2024_03_11T15_50_12.323046
path:
- results_2024-03-11T15-50-12.323046.parquet
- split: latest
path:
- results_2024-03-11T15-50-12.323046.parquet
---
# Dataset Card for Evaluation run of Abhaykoul/Qwen1.5-0.5B-vortex-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Abhaykoul/Qwen1.5-0.5B-vortex-v2](https://huggingface.co/Abhaykoul/Qwen1.5-0.5B-vortex-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Abhaykoul__Qwen1.5-0.5B-vortex-v2",
"harness_winogrande_5",
split="train")
```
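To enumerate the 63 task configurations programmatically rather than reading them out of the YAML header, a short sketch:
```python
from datasets import get_dataset_config_names

# List every configuration declared in this repository's YAML header
# (one per evaluated task, plus the aggregated "results" config).
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_Abhaykoul__Qwen1.5-0.5B-vortex-v2"
)
print(len(configs))
print([c for c in configs if c.startswith("harness_hendrycksTest")][:5])
```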
## Latest results
These are the [latest results from run 2024-03-11T15:50:12.323046](https://huggingface.co/datasets/open-llm-leaderboard/details_Abhaykoul__Qwen1.5-0.5B-vortex-v2/blob/main/results_2024-03-11T15-50-12.323046.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3597331970616422,
"acc_stderr": 0.03388517744885889,
"acc_norm": 0.3635416873826628,
"acc_norm_stderr": 0.03469794845518048,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766375,
"mc2": 0.4429283527301926,
"mc2_stderr": 0.014852198844725878
},
"harness|arc:challenge|25": {
"acc": 0.2721843003412969,
"acc_stderr": 0.013006600406423704,
"acc_norm": 0.30631399317406144,
"acc_norm_stderr": 0.013470584417276511
},
"harness|hellaswag|10": {
"acc": 0.3647679745070703,
"acc_stderr": 0.0048038126319949696,
"acc_norm": 0.4553873730332603,
"acc_norm_stderr": 0.0049698795328430925
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740234,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740234
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.35094339622641507,
"acc_stderr": 0.029373646253234686,
"acc_norm": 0.35094339622641507,
"acc_norm_stderr": 0.029373646253234686
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3819444444444444,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.3819444444444444,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102967,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102967
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.041857744240220554,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.041857744240220554
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.02313528797432563,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.02313528797432563
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.038932596106046755,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.038932596106046755
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3935483870967742,
"acc_stderr": 0.027791878753132274,
"acc_norm": 0.3935483870967742,
"acc_norm_stderr": 0.027791878753132274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.0390369864774844,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.0390369864774844
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4696969696969697,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.4696969696969697,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.43523316062176165,
"acc_stderr": 0.03578038165008585,
"acc_norm": 0.43523316062176165,
"acc_norm_stderr": 0.03578038165008585
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36923076923076925,
"acc_stderr": 0.024468615241478923,
"acc_norm": 0.36923076923076925,
"acc_norm_stderr": 0.024468615241478923
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36554621848739494,
"acc_stderr": 0.03128217706368461,
"acc_norm": 0.36554621848739494,
"acc_norm_stderr": 0.03128217706368461
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4073394495412844,
"acc_stderr": 0.021065986244412877,
"acc_norm": 0.4073394495412844,
"acc_norm_stderr": 0.021065986244412877
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4362745098039216,
"acc_stderr": 0.03480693138457038,
"acc_norm": 0.4362745098039216,
"acc_norm_stderr": 0.03480693138457038
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5232067510548524,
"acc_stderr": 0.03251215201141018,
"acc_norm": 0.5232067510548524,
"acc_norm_stderr": 0.03251215201141018
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.24663677130044842,
"acc_stderr": 0.028930413120910874,
"acc_norm": 0.24663677130044842,
"acc_norm_stderr": 0.028930413120910874
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5289256198347108,
"acc_stderr": 0.04556710331269498,
"acc_norm": 0.5289256198347108,
"acc_norm_stderr": 0.04556710331269498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0471282125742677,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0471282125742677
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4658119658119658,
"acc_stderr": 0.03267942734081228,
"acc_norm": 0.4658119658119658,
"acc_norm_stderr": 0.03267942734081228
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.334610472541507,
"acc_stderr": 0.016873468641592157,
"acc_norm": 0.334610472541507,
"acc_norm_stderr": 0.016873468641592157
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.42485549132947975,
"acc_stderr": 0.026613350840261736,
"acc_norm": 0.42485549132947975,
"acc_norm_stderr": 0.026613350840261736
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2547486033519553,
"acc_stderr": 0.014572650383409153,
"acc_norm": 0.2547486033519553,
"acc_norm_stderr": 0.014572650383409153
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.028332397483664274,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.028332397483664274
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3858520900321543,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.3858520900321543,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.35802469135802467,
"acc_stderr": 0.026675611926037086,
"acc_norm": 0.35802469135802467,
"acc_norm_stderr": 0.026675611926037086
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880585,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880585
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30378096479791394,
"acc_stderr": 0.011745787720472472,
"acc_norm": 0.30378096479791394,
"acc_norm_stderr": 0.011745787720472472
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3713235294117647,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.3713235294117647,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.018607552131279834,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.018607552131279834
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.047093069786618966,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.047093069786618966
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.028920583220675602,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.028920583220675602
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5074626865671642,
"acc_stderr": 0.03535140084276719,
"acc_norm": 0.5074626865671642,
"acc_norm_stderr": 0.03535140084276719
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3433734939759036,
"acc_stderr": 0.03696584317010601,
"acc_norm": 0.3433734939759036,
"acc_norm_stderr": 0.03696584317010601
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766375,
"mc2": 0.4429283527301926,
"mc2_stderr": 0.014852198844725878
},
"harness|winogrande|5": {
"acc": 0.5603788476716653,
"acc_stderr": 0.013949649776015684
},
"harness|gsm8k|5": {
"acc": 0.05913570887035633,
"acc_stderr": 0.006497266660428841
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/emile_bertin_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of emile_bertin/エミール・ベルタン/埃米尔·贝尔汀 (Azur Lane)
This is the dataset of emile_bertin/エミール・ベルタン/埃米尔·贝尔汀 (Azur Lane), containing 40 images and their tags.
The core tags of this character are `blonde_hair, breasts, long_hair, blue_eyes, bow, hair_bow, blue_bow, bangs, large_breasts, very_long_hair, wavy_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 48.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emile_bertin_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 31.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emile_bertin_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 93 | 62.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emile_bertin_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 44.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emile_bertin_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 93 | 82.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emile_bertin_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/emile_bertin_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
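As a small follow-up, here is a minimal sketch of filtering the loaded items by tag. It assumes the archive was extracted to `dataset_dir` as in the snippet above, and uses the `solo` tag (which appears in the cluster tables below) purely as an example:
```python
from waifuc.source import LocalSource

# assuming the raw archive was extracted to 'dataset_dir' as in the snippet above;
# item.meta['tags'] holds the tag metadata printed by the loop above
solo_items = [item for item in LocalSource('dataset_dir') if 'solo' in item.meta['tags']]
print(f'{len(solo_items)} images tagged with solo')
```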
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, bare_shoulders, cleavage, smile, white_gloves, one-piece_swimsuit, armlet, frills, simple_background, white_background, blush, choker, closed_mouth, covered_navel, cross, halterneck, sitting, earrings, open_mouth, thighs |
| 1 | 13 |  |  |  |  |  | 1girl, blush, smile, white_gloves, looking_at_viewer, solo, maid_headdress, blue_dress, cleavage, detached_collar, apron, open_mouth, puffy_short_sleeves, striped_thighhighs, white_panties, asymmetrical_bangs, blue_thighhighs, bowtie, frilled_dress, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | bare_shoulders | cleavage | smile | white_gloves | one-piece_swimsuit | armlet | frills | simple_background | white_background | blush | choker | closed_mouth | covered_navel | cross | halterneck | sitting | earrings | open_mouth | thighs | maid_headdress | blue_dress | detached_collar | apron | puffy_short_sleeves | striped_thighhighs | white_panties | asymmetrical_bangs | blue_thighhighs | bowtie | frilled_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------------|:-----------|:--------|:---------------|:---------------------|:---------|:---------|:--------------------|:-------------------|:--------|:---------|:---------------|:----------------|:--------|:-------------|:----------|:-----------|:-------------|:---------|:-----------------|:-------------|:------------------|:--------|:----------------------|:---------------------|:----------------|:---------------------|:------------------|:---------|:----------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | | X | X | X | | | | | | X | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X |
|
flax-community/german-common-voice-processed | ---
language:
- de
--- |
arthurmluz/xlsum_data-xlsum_cstnews_1024_results | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 28015420
num_examples: 7175
download_size: 17143190
dataset_size: 28015420
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "xlsum_data-xlsum_cstnews_1024_results"
rouge = {'rouge1': 0.2759461774292122, 'rouge2': 0.09432308277095043, 'rougeL': 0.18435043787434557, 'rougeLsum': 0.18435043787434557}
bert = {'precision': 0.6937206263243114, 'recall': 0.7438775094544016, 'f1': 0.7174445173050884}
mover = 0.5946254344345815
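For reference, the aggregate ROUGE numbers above could in principle be recomputed from the `summary` and `gen_summary` columns. Below is a minimal sketch using the `evaluate` library; the exact settings used to produce the stored scores are not documented, so treat this as an approximation rather than the original pipeline:
```python
import evaluate
from datasets import load_dataset

ds = load_dataset("arthurmluz/xlsum_data-xlsum_cstnews_1024_results", split="validation")

# aggregate ROUGE of the generated summaries against the reference summaries
rouge = evaluate.load("rouge")
scores = rouge.compute(predictions=ds["gen_summary"], references=ds["summary"])
print(scores)  # rouge1 / rouge2 / rougeL / rougeLsum
```
|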
codefuse-ai/CodeFuse-DevOps-Eval | ---
license: mit
language:
- en
- zh
tags:
- devops
- aiops
- llm
pretty_name: DevOps-Eval
size_categories:
- n<1K
task_categories:
- question-answering
- multiple-choice
---
DevOps-Eval is a comprehensive Chinese evaluation suite specifically designed for foundation models in the DevOps field. It consists of 5977 multiple-choice questions spanning 55 diverse categories. Please visit our website and [GitHub](https://github.com/codefuse-ai/codefuse-devops-eval) for more details.
Each category consists of two splits: dev and test. The dev set per subject consists of five exemplars with explanations for few-shot evaluation, and the test set is for model evaluation. Labels on the test split are released, so users can evaluate their results and automatically obtain test accuracy. See [how to evaluate](https://github.com/codefuse-ai/codefuse-devops-eval#-how-to-evaluate).
### Load the data
```python
from datasets import load_dataset

dataset = load_dataset("devopseval-exam", name="UnitTesting")
print(dataset['val'][0])
# {"id": 1, "question": "单元测试应该覆盖以下哪些方面?", "A": "正常路径", "B": "异常路径", "C": "边界值条件", "D": "所有以上", "answer": "D", "explanation": ""}
# (English gloss: "Which of the following should unit tests cover?" A: normal paths, B: exception paths, C: boundary conditions, D: all of the above)
```
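As a minimal sketch of few-shot usage, the five dev exemplars of a category can be turned into a prompt for a test question. The field names (`question`, `A`–`D`, `answer`) follow the printed example above; the split names and the English scaffolding strings are assumptions, not part of the official evaluation code:
```python
from datasets import load_dataset

dataset = load_dataset("devopseval-exam", name="UnitTesting")

def format_example(example, with_answer=True):
    # render one multiple-choice question as a plain-text block
    text = example["question"] + "\n"
    for choice in ("A", "B", "C", "D"):
        text += f"{choice}. {example[choice]}\n"
    return text + "Answer: " + (example["answer"] if with_answer else "")

# five dev exemplars as few-shot context, then the first test question
shots = "\n\n".join(format_example(ex) for ex in dataset["dev"])
prompt = shots + "\n\n" + format_example(dataset["test"][0], with_answer=False)
print(prompt)
```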
#### Notes
More details on loading and using the data are available on our [GitHub](https://github.com/codefuse-ai/codefuse-devops-eval) page. |
CyberHarem/kamitsure_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kamitsure/カミツレ (Pokémon)
This is the dataset of kamitsure/カミツレ (Pokémon), containing 500 images and their tags.
The core tags of this character are `headphones, blue_eyes, breasts, blonde_hair, short_hair, bangs, blunt_bangs, black_hair, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 449.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamitsure_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 298.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamitsure_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1027 | 553.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamitsure_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 413.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamitsure_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1027 | 721.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamitsure_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kamitsure_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
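Building on the loop above, a minimal sketch for mining tags is to count how often each tag occurs across the dataset, which is essentially the raw material for the cluster tables below (it assumes `item.meta['tags']` holds the tag names, as printed by the snippet):
```python
from collections import Counter

from waifuc.source import LocalSource

# count tag occurrences over all items; list(...) yields the tag names
# whether meta['tags'] is a dict keyed by tag or a plain list of tags
tag_counter = Counter()
for item in LocalSource('dataset_dir'):
    tag_counter.update(list(item.meta['tags']))
print(tag_counter.most_common(20))
```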
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, sidelocks, yellow_jacket, blush, short_hair_with_long_locks, hand_up, solo, cleavage, closed_mouth, smile, collarbone, sitting, sleeveless, watermark |
| 1 | 5 |  |  |  |  |  | 1girl, fur_coat, smile, solo, looking_at_viewer, large_breasts, cleavage |
| 2 | 7 |  |  |  |  |  | 1girl, solo, midriff, fur_coat, navel, smile, nail_polish, very_long_hair |
| 3 | 15 |  |  |  |  |  | 1girl, solo, holding_poke_ball, poke_ball_(basic), fur_coat, nail_polish, midriff, looking_at_viewer, shorts, smile |
| 4 | 21 |  |  |  |  |  | 1girl, navel, solo, holding_poke_ball, poke_ball_(basic), bare_shoulders, choker, black_pantyhose, cleavage, high_heels |
| 5 | 11 |  |  |  |  |  | 1girl, solo, bare_shoulders, choker, smile, black_pantyhose, open_mouth, sitting |
| 6 | 21 |  |  |  |  |  | 1girl, bare_arms, black_choker, yellow_dress, black_pantyhose, short_dress, bare_shoulders, collarbone, looking_at_viewer, solo, sleeveless_dress, yellow_skirt, closed_mouth, black_headwear, cable |
| 7 | 9 |  |  |  |  |  | 1girl, blush, large_breasts, nipples, solo, choker, huge_breasts |
| 8 | 16 |  |  |  |  |  | 1girl, hetero, nipples, penis, 1boy, blush, solo_focus, vaginal, censored, cum_in_pussy, spread_legs, large_breasts, pantyhose, sex_from_behind, sweat, medium_breasts, navel, pubic_hair, straddling, torn_clothes |
| 9 | 8 |  |  |  |  |  | nipples, 1boy, 1girl, hetero, navel, penis, pussy, sex, vaginal, blush, looking_at_viewer, open_mouth, spread_legs, completely_nude, mosaic_censoring, arms_up, sweat, armpits, collarbone, on_back, pov |
| 10 | 5 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, official_alternate_costume, black_shorts, blush, earmuffs, eyelashes, red_scarf, solo, hand_up, nail_polish, open_coat, :d, black_footwear, black_nails, boots, closed_mouth, holding, open_mouth, simple_background, sitting, twintails, white_background, white_coat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | sidelocks | yellow_jacket | blush | short_hair_with_long_locks | hand_up | solo | cleavage | closed_mouth | smile | collarbone | sitting | sleeveless | watermark | fur_coat | large_breasts | midriff | navel | nail_polish | very_long_hair | holding_poke_ball | poke_ball_(basic) | shorts | bare_shoulders | choker | black_pantyhose | high_heels | open_mouth | bare_arms | black_choker | yellow_dress | short_dress | sleeveless_dress | yellow_skirt | black_headwear | cable | nipples | huge_breasts | hetero | penis | 1boy | solo_focus | vaginal | censored | cum_in_pussy | spread_legs | pantyhose | sex_from_behind | sweat | medium_breasts | pubic_hair | straddling | torn_clothes | pussy | sex | completely_nude | mosaic_censoring | arms_up | armpits | on_back | pov | long_sleeves | official_alternate_costume | black_shorts | earmuffs | eyelashes | red_scarf | open_coat | :d | black_footwear | black_nails | boots | holding | simple_background | twintails | white_background | white_coat |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------------|:------------|:----------------|:--------|:-----------------------------|:----------|:-------|:-----------|:---------------|:--------|:-------------|:----------|:-------------|:------------|:-----------|:----------------|:----------|:--------|:--------------|:-----------------|:--------------------|:--------------------|:---------|:-----------------|:---------|:------------------|:-------------|:-------------|:------------|:---------------|:---------------|:--------------|:-------------------|:---------------|:-----------------|:--------|:----------|:---------------|:---------|:--------|:-------|:-------------|:----------|:-----------|:---------------|:--------------|:------------|:------------------|:--------|:-----------------|:-------------|:-------------|:---------------|:--------|:------|:------------------|:-------------------|:----------|:----------|:----------|:------|:---------------|:-----------------------------|:---------------|:-----------|:------------|:------------|:------------|:-----|:-----------------|:--------------|:--------|:----------|:--------------------|:------------|:-------------------|:-------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | | | | | | X | X | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | | | | | X | | | X | | | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 15 |  |  |  |  |  | X | X | | | | | | X | | | X | | | | | X | | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 21 |  |  |  |  |  | X | | | | | | | X | X | | | | | | | | | | X | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | | | | | | | X | | | X | | X | | | | | | | | | | | | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 21 |  |  |  |  |  | X | X | | | | | | X | | X | | X | | | | | | | | | | | | | X | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | | | | X | | | X | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 16 |  |  |  |  |  | X | | | | X | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 8 |  |  |  |  |  | X | X | | | X | | | | | | | X | | | | | | | X | | | | | | | | | | X | | | | | | | | | X | | X | X | X | | X | | | X | | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | X | X | | | X | | X | X | | X | | | X | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
d42me/opinions_qa_finetuning_small | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 305641291
num_examples: 721604
download_size: 30105846
dataset_size: 305641291
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SergioSCA/Logistics_v2_yolov8 | ---
license: apache-2.0
---
|
TheFinAI/flare-finred | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: text
dtype: string
- name: answer
dtype: string
- name: label
sequence: string
splits:
- name: test
num_bytes: 1521946
num_examples: 1068
download_size: 478837
dataset_size: 1521946
---
# Dataset Card for "flare-finred"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/ghost_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 20543905
num_examples: 100000
download_size: 320498
dataset_size: 20543905
---
# Dataset Card for "ghost_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-kand2-sdxl-wuerst-karlo/9208a1cc | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 174
num_examples: 10
download_size: 1342
dataset_size: 174
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "9208a1cc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Qwen__Qwen1.5-72B | ---
pretty_name: Evaluation run of Qwen/Qwen1.5-72B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Qwen/Qwen1.5-72B](https://huggingface.co/Qwen/Qwen1.5-72B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Qwen__Qwen1.5-72B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T21:24:58.616285](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen1.5-72B/blob/main/results_2024-02-18T21-24-58.616285.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7669743429877653,\n\
\ \"acc_stderr\": 0.027971495069922473,\n \"acc_norm\": 0.7715834368806984,\n\
\ \"acc_norm_stderr\": 0.028493498109494097,\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.017233299399571227,\n \"mc2\": 0.596080564321232,\n\
\ \"mc2_stderr\": 0.01451800985281567\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759095,\n\
\ \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.01385583128749773\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6666998605855408,\n\
\ \"acc_stderr\": 0.004704293898729911,\n \"acc_norm\": 0.8598884684325832,\n\
\ \"acc_norm_stderr\": 0.003463933286063887\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n\
\ \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n\
\ \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8881578947368421,\n \"acc_stderr\": 0.02564834125169361,\n\
\ \"acc_norm\": 0.8881578947368421,\n \"acc_norm_stderr\": 0.02564834125169361\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8226415094339623,\n \"acc_stderr\": 0.023508739218846938,\n\
\ \"acc_norm\": 0.8226415094339623,\n \"acc_norm_stderr\": 0.023508739218846938\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9027777777777778,\n\
\ \"acc_stderr\": 0.024774516250440175,\n \"acc_norm\": 0.9027777777777778,\n\
\ \"acc_norm_stderr\": 0.024774516250440175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7630057803468208,\n\
\ \"acc_stderr\": 0.032424147574830975,\n \"acc_norm\": 0.7630057803468208,\n\
\ \"acc_norm_stderr\": 0.032424147574830975\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.049512182523962604,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.049512182523962604\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n\
\ \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.8085106382978723,\n \"acc_stderr\": 0.025722149992637798,\n\
\ \"acc_norm\": 0.8085106382978723,\n \"acc_norm_stderr\": 0.025722149992637798\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.5877192982456141,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.03416520447747549,\n\
\ \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.03416520447747549\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6984126984126984,\n \"acc_stderr\": 0.0236369759961018,\n \"acc_norm\"\
: 0.6984126984126984,\n \"acc_norm_stderr\": 0.0236369759961018\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n\
\ \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 0.5793650793650794,\n\
\ \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n\
\ \"acc_stderr\": 0.018225757949432306,\n \"acc_norm\": 0.8838709677419355,\n\
\ \"acc_norm_stderr\": 0.018225757949432306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6600985221674877,\n \"acc_stderr\": 0.033327690684107895,\n\
\ \"acc_norm\": 0.6600985221674877,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774707,\n \"acc_norm\"\
: 0.84,\n \"acc_norm_stderr\": 0.03684529491774707\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9292929292929293,\n \"acc_stderr\": 0.0182631054201995,\n \"acc_norm\"\
: 0.9292929292929293,\n \"acc_norm_stderr\": 0.0182631054201995\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792194,\n\
\ \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792194\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8153846153846154,\n \"acc_stderr\": 0.01967163241310029,\n \
\ \"acc_norm\": 0.8153846153846154,\n \"acc_norm_stderr\": 0.01967163241310029\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4925925925925926,\n \"acc_stderr\": 0.030482192395191506,\n \
\ \"acc_norm\": 0.4925925925925926,\n \"acc_norm_stderr\": 0.030482192395191506\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8361344537815126,\n \"acc_stderr\": 0.024044054940440488,\n\
\ \"acc_norm\": 0.8361344537815126,\n \"acc_norm_stderr\": 0.024044054940440488\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5761589403973509,\n \"acc_stderr\": 0.04034846678603396,\n \"\
acc_norm\": 0.5761589403973509,\n \"acc_norm_stderr\": 0.04034846678603396\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9302752293577982,\n \"acc_stderr\": 0.01091942641184862,\n \"\
acc_norm\": 0.9302752293577982,\n \"acc_norm_stderr\": 0.01091942641184862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6851851851851852,\n \"acc_stderr\": 0.0316746870682898,\n \"acc_norm\"\
: 0.6851851851851852,\n \"acc_norm_stderr\": 0.0316746870682898\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n\
\ \"acc_stderr\": 0.017744453647073322,\n \"acc_norm\": 0.9313725490196079,\n\
\ \"acc_norm_stderr\": 0.017744453647073322\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640273,\n\
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640273\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.02871877688934232,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.02871877688934232\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9090909090909091,\n \"acc_stderr\": 0.026243194054073892,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.026243194054073892\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.02632138319878367,\n\
\ \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.02632138319878367\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6517857142857143,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.6517857142857143,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\
\ \"acc_stderr\": 0.015537514263253874,\n \"acc_norm\": 0.9401709401709402,\n\
\ \"acc_norm_stderr\": 0.015537514263253874\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9157088122605364,\n\
\ \"acc_stderr\": 0.009934966499513786,\n \"acc_norm\": 0.9157088122605364,\n\
\ \"acc_norm_stderr\": 0.009934966499513786\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442265,\n\
\ \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442265\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6346368715083799,\n\
\ \"acc_stderr\": 0.016104833880142302,\n \"acc_norm\": 0.6346368715083799,\n\
\ \"acc_norm_stderr\": 0.016104833880142302\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8594771241830066,\n \"acc_stderr\": 0.01989943546353996,\n\
\ \"acc_norm\": 0.8594771241830066,\n \"acc_norm_stderr\": 0.01989943546353996\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8392282958199357,\n\
\ \"acc_stderr\": 0.020862388082391888,\n \"acc_norm\": 0.8392282958199357,\n\
\ \"acc_norm_stderr\": 0.020862388082391888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8641975308641975,\n \"acc_stderr\": 0.0190615881815054,\n\
\ \"acc_norm\": 0.8641975308641975,\n \"acc_norm_stderr\": 0.0190615881815054\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6276595744680851,\n \"acc_stderr\": 0.028838921471251455,\n \
\ \"acc_norm\": 0.6276595744680851,\n \"acc_norm_stderr\": 0.028838921471251455\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6121251629726207,\n\
\ \"acc_stderr\": 0.012444998309675631,\n \"acc_norm\": 0.6121251629726207,\n\
\ \"acc_norm_stderr\": 0.012444998309675631\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.02334516361654484,\n\
\ \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.02334516361654484\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8137254901960784,\n \"acc_stderr\": 0.01575052628436337,\n \
\ \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.01575052628436337\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.024127463462650156,\n\
\ \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.024127463462650156\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594194,\n \
\ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594194\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.017233299399571227,\n \"mc2\": 0.596080564321232,\n\
\ \"mc2_stderr\": 0.01451800985281567\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8303078137332282,\n \"acc_stderr\": 0.010549542647363696\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6573161485974223,\n \
\ \"acc_stderr\": 0.013073030230827912\n }\n}\n```"
repo_url: https://huggingface.co/Qwen/Qwen1.5-72B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|arc:challenge|25_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|gsm8k|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hellaswag|10_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T21-24-58.616285.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T21-24-58.616285.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- '**/details_harness|winogrande|5_2024-02-18T21-24-58.616285.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T21-24-58.616285.parquet'
- config_name: results
data_files:
- split: 2024_02_18T21_24_58.616285
path:
- results_2024-02-18T21-24-58.616285.parquet
- split: latest
path:
- results_2024-02-18T21-24-58.616285.parquet
---
# Dataset Card for Evaluation run of Qwen/Qwen1.5-72B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Qwen/Qwen1.5-72B](https://huggingface.co/Qwen/Qwen1.5-72B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Qwen__Qwen1.5-72B",
"harness_winogrande_5",
	split="latest")
```
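Similarly, a minimal sketch for pulling the aggregated metrics from the "results" configuration (using the "latest" split defined in the YAML above):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated run metrics;
# the "latest" split resolves to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Qwen__Qwen1.5-72B",
	"results",
	split="latest")
print(results[0])
```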
## Latest results
These are the [latest results from run 2024-02-18T21:24:58.616285](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen1.5-72B/blob/main/results_2024-02-18T21-24-58.616285.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7669743429877653,
"acc_stderr": 0.027971495069922473,
"acc_norm": 0.7715834368806984,
"acc_norm_stderr": 0.028493498109494097,
"mc1": 0.412484700122399,
"mc1_stderr": 0.017233299399571227,
"mc2": 0.596080564321232,
"mc2_stderr": 0.01451800985281567
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759095,
"acc_norm": 0.658703071672355,
"acc_norm_stderr": 0.01385583128749773
},
"harness|hellaswag|10": {
"acc": 0.6666998605855408,
"acc_stderr": 0.004704293898729911,
"acc_norm": 0.8598884684325832,
"acc_norm_stderr": 0.003463933286063887
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8881578947368421,
"acc_stderr": 0.02564834125169361,
"acc_norm": 0.8881578947368421,
"acc_norm_stderr": 0.02564834125169361
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8226415094339623,
"acc_stderr": 0.023508739218846938,
"acc_norm": 0.8226415094339623,
"acc_norm_stderr": 0.023508739218846938
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.024774516250440175,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.024774516250440175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.032424147574830975,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.032424147574830975
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.049512182523962604,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.049512182523962604
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8085106382978723,
"acc_stderr": 0.025722149992637798,
"acc_norm": 0.8085106382978723,
"acc_norm_stderr": 0.025722149992637798
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.03416520447747549,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.03416520447747549
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6984126984126984,
"acc_stderr": 0.0236369759961018,
"acc_norm": 0.6984126984126984,
"acc_norm_stderr": 0.0236369759961018
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5793650793650794,
"acc_stderr": 0.04415438226743745,
"acc_norm": 0.5793650793650794,
"acc_norm_stderr": 0.04415438226743745
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432306,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6600985221674877,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.6600985221674877,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774707,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774707
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.0182631054201995,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.0182631054201995
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9896373056994818,
"acc_stderr": 0.007308424386792194,
"acc_norm": 0.9896373056994818,
"acc_norm_stderr": 0.007308424386792194
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8153846153846154,
"acc_stderr": 0.01967163241310029,
"acc_norm": 0.8153846153846154,
"acc_norm_stderr": 0.01967163241310029
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4925925925925926,
"acc_stderr": 0.030482192395191506,
"acc_norm": 0.4925925925925926,
"acc_norm_stderr": 0.030482192395191506
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8361344537815126,
"acc_stderr": 0.024044054940440488,
"acc_norm": 0.8361344537815126,
"acc_norm_stderr": 0.024044054940440488
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5761589403973509,
"acc_stderr": 0.04034846678603396,
"acc_norm": 0.5761589403973509,
"acc_norm_stderr": 0.04034846678603396
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9302752293577982,
"acc_stderr": 0.01091942641184862,
"acc_norm": 0.9302752293577982,
"acc_norm_stderr": 0.01091942641184862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.0316746870682898,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.0316746870682898
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640273,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640273
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.02871877688934232,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.02871877688934232
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.026243194054073892,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.026243194054073892
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.02632138319878367,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.02632138319878367
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6517857142857143,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.6517857142857143,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253874,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253874
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9157088122605364,
"acc_stderr": 0.009934966499513786,
"acc_norm": 0.9157088122605364,
"acc_norm_stderr": 0.009934966499513786
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442265,
"acc_norm": 0.8352601156069365,
"acc_norm_stderr": 0.019971040982442265
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6346368715083799,
"acc_stderr": 0.016104833880142302,
"acc_norm": 0.6346368715083799,
"acc_norm_stderr": 0.016104833880142302
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8594771241830066,
"acc_stderr": 0.01989943546353996,
"acc_norm": 0.8594771241830066,
"acc_norm_stderr": 0.01989943546353996
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8392282958199357,
"acc_stderr": 0.020862388082391888,
"acc_norm": 0.8392282958199357,
"acc_norm_stderr": 0.020862388082391888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8641975308641975,
"acc_stderr": 0.0190615881815054,
"acc_norm": 0.8641975308641975,
"acc_norm_stderr": 0.0190615881815054
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6276595744680851,
"acc_stderr": 0.028838921471251455,
"acc_norm": 0.6276595744680851,
"acc_norm_stderr": 0.028838921471251455
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6121251629726207,
"acc_stderr": 0.012444998309675631,
"acc_norm": 0.6121251629726207,
"acc_norm_stderr": 0.012444998309675631
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8198529411764706,
"acc_stderr": 0.02334516361654484,
"acc_norm": 0.8198529411764706,
"acc_norm_stderr": 0.02334516361654484
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.01575052628436337,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.01575052628436337
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.024127463462650156,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.024127463462650156
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594194,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594194
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.412484700122399,
"mc1_stderr": 0.017233299399571227,
"mc2": 0.596080564321232,
"mc2_stderr": 0.01451800985281567
},
"harness|winogrande|5": {
"acc": 0.8303078137332282,
"acc_stderr": 0.010549542647363696
},
"harness|gsm8k|5": {
"acc": 0.6573161485974223,
"acc_stderr": 0.013073030230827912
}
}
```
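As a rough illustration, one way to summarize the per-task scores above is an unweighted mean of `acc` over the hendrycksTest (MMLU) subtasks. A minimal sketch, assuming the JSON above has been saved locally as `results.json` (a hypothetical path):
```python
import json

# Average "acc" over the hendrycksTest (MMLU) tasks from a local
# copy of the results JSON shown above; "results.json" is hypothetical.
with open("results.json") as f:
    results = json.load(f)

mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
print(f"MMLU mean acc over {len(mmlu_accs)} tasks: "
      f"{sum(mmlu_accs) / len(mmlu_accs):.4f}")
```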
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ibranze/araproje_mmlu_tr_conf_mgpt_nearestscore_true_x | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: validation
num_bytes: 137404.0
num_examples: 250
download_size: 83864
dataset_size: 137404.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_mmlu_tr_conf_mgpt_nearestscore_true_x"
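A minimal usage sketch (assuming the `datasets` library; the letter mapping follows the `class_label` names in the YAML above):
```python
from datasets import load_dataset

# Load the 250-example validation split described in the YAML above.
ds = load_dataset("ibranze/araproje_mmlu_tr_conf_mgpt_nearestscore_true_x",
                  split="validation")

letters = ["A", "B", "C", "D"]  # class_label names: 0->A ... 3->D
ex = ds[0]
print(ex["question"], ex["choices"], letters[ex["answer"]])
```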
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_martyn__solar-megamerge-dare-10.7b-v1 | ---
pretty_name: Evaluation run of martyn/solar-megamerge-dare-10.7b-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [martyn/solar-megamerge-dare-10.7b-v1](https://huggingface.co/martyn/solar-megamerge-dare-10.7b-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, with the split named after the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_martyn__solar-megamerge-dare-10.7b-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T12:28:22.950465](https://huggingface.co/datasets/open-llm-leaderboard/details_martyn__solar-megamerge-dare-10.7b-v1/blob/main/results_2024-01-04T12-28-22.950465.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6610634231695184,\n\
\ \"acc_stderr\": 0.031657358174671586,\n \"acc_norm\": 0.6635919799924697,\n\
\ \"acc_norm_stderr\": 0.03229437004691903,\n \"mc1\": 0.386780905752754,\n\
\ \"mc1_stderr\": 0.01704885701051511,\n \"mc2\": 0.5433095073342544,\n\
\ \"mc2_stderr\": 0.015460055514713956\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6168941979522184,\n \"acc_stderr\": 0.014206472661672876,\n\
\ \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6608245369448317,\n\
\ \"acc_stderr\": 0.004724619193427587,\n \"acc_norm\": 0.8530173272256523,\n\
\ \"acc_norm_stderr\": 0.0035336498517284792\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7697368421052632,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.7697368421052632,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130733,\n \"\
acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130733\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.043902592653775614,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.043902592653775614\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568532,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568532\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03011768892950357,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03011768892950357\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603915,\n \"\
acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.02466674491518721,\n \
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.02466674491518721\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465718,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465718\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659807,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659807\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643526,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643526\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.0230943295825957,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.0230943295825957\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.03021683101150878,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.03021683101150878\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.03957835471980981,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.019119892798924974,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.019119892798924974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903348,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903348\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n\
\ \"acc_stderr\": 0.016519594275297117,\n \"acc_norm\": 0.4223463687150838,\n\
\ \"acc_norm_stderr\": 0.016519594275297117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667874,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.023576881744005705,\n\
\ \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.023576881744005705\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4876140808344198,\n\
\ \"acc_stderr\": 0.012766317315473556,\n \"acc_norm\": 0.4876140808344198,\n\
\ \"acc_norm_stderr\": 0.012766317315473556\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.02725720260611494,\n\
\ \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.02725720260611494\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7107843137254902,\n \"acc_stderr\": 0.018342529845275915,\n \
\ \"acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.018342529845275915\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904028,\n\
\ \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904028\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.386780905752754,\n\
\ \"mc1_stderr\": 0.01704885701051511,\n \"mc2\": 0.5433095073342544,\n\
\ \"mc2_stderr\": 0.015460055514713956\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825905\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5799848369977255,\n \
\ \"acc_stderr\": 0.013595121688520485\n }\n}\n```"
repo_url: https://huggingface.co/martyn/solar-megamerge-dare-10.7b-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|arc:challenge|25_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|gsm8k|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hellaswag|10_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-28-22.950465.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T12-28-22.950465.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- '**/details_harness|winogrande|5_2024-01-04T12-28-22.950465.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T12-28-22.950465.parquet'
- config_name: results
data_files:
- split: 2024_01_04T12_28_22.950465
path:
- results_2024-01-04T12-28-22.950465.parquet
- split: latest
path:
- results_2024-01-04T12-28-22.950465.parquet
---
# Dataset Card for Evaluation run of martyn/solar-megamerge-dare-10.7b-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [martyn/solar-megamerge-dare-10.7b-v1](https://huggingface.co/martyn/solar-megamerge-dare-10.7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_martyn__solar-megamerge-dare-10.7b-v1",
"harness_winogrande_5",
split="train")
```
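The same pattern works for any configuration listed in this card's metadata. As a minimal sketch (config and split names taken from the YAML above), here is how to load the `latest` split of a single MMLU subtask and the aggregated `results` configuration:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_martyn__solar-megamerge-dare-10.7b-v1"

# Per-sample details for one MMLU subtask; "latest" points at the most
# recent run (here the single run from 2024-01-04T12:28:22.950465).
details = load_dataset(REPO, "harness_hendrycksTest_abstract_algebra_5", split="latest")

# Aggregated metrics for the whole run live in the "results" configuration.
results = load_dataset(REPO, "results", split="latest")
```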
## Latest results
These are the [latest results from run 2024-01-04T12:28:22.950465](https://huggingface.co/datasets/open-llm-leaderboard/details_martyn__solar-megamerge-dare-10.7b-v1/blob/main/results_2024-01-04T12-28-22.950465.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6610634231695184,
"acc_stderr": 0.031657358174671586,
"acc_norm": 0.6635919799924697,
"acc_norm_stderr": 0.03229437004691903,
"mc1": 0.386780905752754,
"mc1_stderr": 0.01704885701051511,
"mc2": 0.5433095073342544,
"mc2_stderr": 0.015460055514713956
},
"harness|arc:challenge|25": {
"acc": 0.6168941979522184,
"acc_stderr": 0.014206472661672876,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.013830568927974332
},
"harness|hellaswag|10": {
"acc": 0.6608245369448317,
"acc_stderr": 0.004724619193427587,
"acc_norm": 0.8530173272256523,
"acc_norm_stderr": 0.0035336498517284792
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7697368421052632,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.7697368421052632,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130733,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130733
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.043902592653775614,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.043902592653775614
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568532,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568532
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03011768892950357,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03011768892950357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603915,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.02466674491518721,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.02466674491518721
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465718,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465718
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659807,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659807
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.03381200005643526,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.03381200005643526
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.0230943295825957,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.0230943295825957
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.03021683101150878,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.03021683101150878
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980981,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980981
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924974,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903348,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903348
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.016519594275297117,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.016519594275297117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667874,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7654320987654321,
"acc_stderr": 0.023576881744005705,
"acc_norm": 0.7654320987654321,
"acc_norm_stderr": 0.023576881744005705
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4876140808344198,
"acc_stderr": 0.012766317315473556,
"acc_norm": 0.4876140808344198,
"acc_norm_stderr": 0.012766317315473556
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.02725720260611494,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.02725720260611494
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.018342529845275915,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.018342529845275915
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904028,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904028
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.386780905752754,
"mc1_stderr": 0.01704885701051511,
"mc2": 0.5433095073342544,
"mc2_stderr": 0.015460055514713956
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825905
},
"harness|gsm8k|5": {
"acc": 0.5799848369977255,
"acc_stderr": 0.013595121688520485
}
}
```
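If you prefer to work with the raw JSON linked above, here is a minimal sketch that downloads it and ranks the five weakest MMLU subtasks by accuracy. It assumes the direct-download `resolve/main` counterpart of the `blob/main` link above, and that the per-task scores sit under a top-level `"results"` key (falling back to the document root otherwise):
```python
import json
from urllib.request import urlopen

# Direct-download URL for the results file linked above (assumption:
# swapping blob/main for resolve/main yields the raw JSON).
URL = (
    "https://huggingface.co/datasets/open-llm-leaderboard/"
    "details_martyn__solar-megamerge-dare-10.7b-v1/resolve/main/"
    "results_2024-01-04T12-28-22.950465.json"
)

with urlopen(URL) as f:
    data = json.load(f)
scores = data.get("results", data)  # assumption: per-task scores under "results"

# Collect MMLU (hendrycksTest) accuracies and print the five weakest subtasks.
mmlu = {
    task: metrics["acc"]
    for task, metrics in scores.items()
    if task.startswith("harness|hendrycksTest")
}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1])[:5]:
    print(f"{task}: {acc:.3f}")
```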
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
eanderson/squad-nb_v2_temp | ---
license: mit
---
|
coastalcph/pararel_patterns | ---
dataset_info:
features:
- name: relation
dtype: string
- name: query
dtype: string
- name: subject
dtype: string
- name: object
dtype: string
- name: template
dtype: string
- name: template_index
dtype: int64
- name: candidates
sequence: string
splits:
- name: train
num_bytes: 293951032
num_examples: 157656
download_size: 8219151
dataset_size: 293951032
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BangumiBase/cardcaptorsakuraclearcardhen | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Cardcaptor Sakura - Clear Card-hen
This is the image base of the bangumi Cardcaptor Sakura - Clear Card-hen. We detected 46 characters and 5,120 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded files to eliminate potentially noisy samples (approximately 1% probability).
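For manual inspection or cleaning, a minimal sketch of fetching a single character's image pack with `huggingface_hub` follows; the per-character `dataset.zip` paths come from the preview table below, and the target directory name is an arbitrary choice:
```python
import zipfile

from huggingface_hub import hf_hub_download

# Download one character's pack from this dataset repo; "0/dataset.zip" is the
# path listed in the preview table below ("all.zip" holds the full dataset).
archive = hf_hub_download(
    repo_id="BangumiBase/cardcaptorsakuraclearcardhen",
    filename="0/dataset.zip",
    repo_type="dataset",
)

# Unpack locally so the images can be reviewed and noisy samples removed.
with zipfile.ZipFile(archive) as zf:
    zf.extractall("character_0")
```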
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1583 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 381 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 26 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 21 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 57 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 47 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 55 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 18 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 24 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 22 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 38 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 381 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 65 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 120 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 81 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 33 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 21 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 19 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 24 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 23 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 14 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 99 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 14 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 46 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 59 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 47 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 129 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 107 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 462 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 64 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 9 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 134 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 90 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 478 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 14 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 21 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 20 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 16 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 29 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 6 | [Download](39/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 40 | 16 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 6 | [Download](41/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 42 | 8 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 23 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 5 | [Download](44/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| noise | 165 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
open-llm-leaderboard/details_cyberagent__open-calm-large | ---
pretty_name: Evaluation run of cyberagent/open-calm-large
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cyberagent/open-calm-large](https://huggingface.co/cyberagent/open-calm-large)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cyberagent__open-calm-large\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-27T06:03:22.906817](https://huggingface.co/datasets/open-llm-leaderboard/details_cyberagent__open-calm-large/blob/main/results_2023-10-27T06-03-22.906817.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0030411073825503355,\n\
\ \"em_stderr\": 0.0005638896908753121,\n \"f1\": 0.0342187500000001,\n\
\ \"f1_stderr\": 0.001160719751306324,\n \"acc\": 0.25610125343097334,\n\
\ \"acc_stderr\": 0.007403477156790928\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0030411073825503355,\n \"em_stderr\": 0.0005638896908753121,\n\
\ \"f1\": 0.0342187500000001,\n \"f1_stderr\": 0.001160719751306324\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225266\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5114443567482242,\n \"acc_stderr\": 0.014048804199859329\n\
\ }\n}\n```"
repo_url: https://huggingface.co/cyberagent/open-calm-large
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|arc:challenge|25_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_27T06_03_22.906817
path:
- '**/details_harness|drop|3_2023-10-27T06-03-22.906817.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T06-03-22.906817.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_27T06_03_22.906817
path:
- '**/details_harness|gsm8k|5_2023-10-27T06-03-22.906817.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T06-03-22.906817.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hellaswag|10_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-09T15-42-03.677218.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-09T15-42-03.677218.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-09T15-42-03.677218.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_27T06_03_22.906817
path:
- '**/details_harness|winogrande|5_2023-10-27T06-03-22.906817.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T06-03-22.906817.parquet'
- config_name: results
data_files:
- split: 2023_09_09T15_42_03.677218
path:
- results_2023-09-09T15-42-03.677218.parquet
- split: 2023_10_27T06_03_22.906817
path:
- results_2023-10-27T06-03-22.906817.parquet
- split: latest
path:
- results_2023-10-27T06-03-22.906817.parquet
---
# Dataset Card for Evaluation run of cyberagent/open-calm-large
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/cyberagent/open-calm-large
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [cyberagent/open-calm-large](https://huggingface.co/cyberagent/open-calm-large) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cyberagent__open-calm-large",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-27T06:03:22.906817](https://huggingface.co/datasets/open-llm-leaderboard/details_cyberagent__open-calm-large/blob/main/results_2023-10-27T06-03-22.906817.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0030411073825503355,
"em_stderr": 0.0005638896908753121,
"f1": 0.0342187500000001,
"f1_stderr": 0.001160719751306324,
"acc": 0.25610125343097334,
"acc_stderr": 0.007403477156790928
},
"harness|drop|3": {
"em": 0.0030411073825503355,
"em_stderr": 0.0005638896908753121,
"f1": 0.0342187500000001,
"f1_stderr": 0.001160719751306324
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225266
},
"harness|winogrande|5": {
"acc": 0.5114443567482242,
"acc_stderr": 0.014048804199859329
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Muennighoff/flores200 | ---
annotations_creators:
- found
language_creators:
- expert-generated
license:
- cc-by-sa-4.0
language:
- ace
- acm
- acq
- aeb
- afr
- ajp
- aka
- als
- amh
- apc
- arb
- ars
- ary
- arz
- asm
- ast
- awa
- ayr
- azb
- azj
- bak
- bam
- ban
- bel
- bem
- ben
- bho
- bjn
- bod
- bos
- bug
- bul
- cat
- ceb
- ces
- cjk
- ckb
- crh
- cym
- dan
- deu
- dik
- dyu
- dzo
- ell
- eng
- epo
- est
- eus
- ewe
- fao
- fij
- fin
- fon
- fra
- fur
- fuv
- gaz
- gla
- gle
- glg
- grn
- guj
- hat
- hau
- heb
- hin
- hne
- hrv
- hun
- hye
- ibo
- ilo
- ind
- isl
- ita
- jav
- jpn
- kab
- kac
- kam
- kan
- kas
- kat
- kaz
- kbp
- kea
- khk
- khm
- kik
- kin
- kir
- kmb
- kmr
- knc
- kon
- kor
- lao
- lij
- lim
- lin
- lit
- lmo
- ltg
- ltz
- lua
- lug
- luo
- lus
- lvs
- mag
- mai
- mal
- mar
- min
- mkd
- mlt
- mni
- mos
- mri
- mya
- nld
- nno
- nob
- npi
- nso
- nus
- nya
- oci
- ory
- pag
- pan
- pap
- pbt
- pes
- plt
- pol
- por
- prs
- quy
- ron
- run
- rus
- sag
- san
- sat
- scn
- shn
- sin
- slk
- slv
- smo
- sna
- snd
- som
- sot
- spa
- srd
- srp
- ssw
- sun
- swe
- swh
- szl
- tam
- taq
- tat
- tel
- tgk
- tgl
- tha
- tir
- tpi
- tsn
- tso
- tuk
- tum
- tur
- twi
- tzm
- uig
- ukr
- umb
- urd
- uzn
- vec
- vie
- war
- wol
- xho
- ydd
- yor
- yue
- zho
- zsm
- zul
multilinguality:
- multilingual
- translation
size_categories:
- unknown
source_datasets:
- extended|flores
task_categories:
- text2text-generation
- translation
task_ids: []
paperswithcode_id: flores
pretty_name: flores200
tags:
- conditional-text-generation
---
# Dataset Card for Flores200
## Table of Contents
- [Dataset Card for Flores200](#dataset-card-for-flores200)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Home:** [Flores](https://github.com/facebookresearch/flores)
- **Repository:** [Github](https://github.com/facebookresearch/flores)
### Dataset Summary
FLORES is a benchmark dataset for machine translation between English and low-resource languages.
>The creation of FLORES200 doubles the existing language coverage of FLORES-101.
Given the nature of the new languages, which have less standardization and require
more specialized professional translations, the verification process became more complex.
This required modifications to the translation workflow. FLORES-200 has several languages
which were not translated from English. Specifically, several languages were translated
from Spanish, French, Russian and Modern Standard Arabic. Moreover, FLORES-200 also
includes two script alternatives for four languages. FLORES-200 consists of translations
from 842 distinct web articles, totaling 3001 sentences. These sentences are divided
into three splits: dev, devtest, and test (hidden). On average, sentences are approximately
21 words long.
**Disclaimer**: *The Flores200 dataset is hosted by Facebook and licensed under the [Creative Commons Attribution-ShareAlike 4.0 International License](https://creativecommons.org/licenses/by-sa/4.0/).*
### Supported Tasks and Leaderboards
#### Multilingual Machine Translation
Refer to the [Dynabench leaderboard](https://dynabench.org/flores/Flores%20MT%20Evaluation%20(FULL)) for additional details on model evaluation on FLORES-101 in the context of the WMT2021 shared task on [Large-Scale Multilingual Machine Translation](http://www.statmt.org/wmt21/large-scale-multilingual-translation-task.html). FLORES-200 is an extension of this.
### Languages
The dataset contains parallel sentences for 200 languages, as mentioned in the original [Github](https://github.com/facebookresearch/flores/blob/master/README.md) page for the project. Languages are identified with the ISO 639-3 code (e.g. `eng`, `fra`, `rus`) plus an additional code describing the script (e.g., "eng_Latn", "ukr_Cyrl"). See [the webpage for code descriptions](https://github.com/facebookresearch/flores/blob/main/flores200/README.md).
Use the configuration `all` to access the full set of parallel sentences for all the available languages in a single command.
Use a hyphenated pairing to get two languages in one datapoint (e.g., "eng_Latn-ukr_Cyrl" will provide sentences in the format below).
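A minimal loading sketch using the `datasets` library (the configuration names follow the patterns described above):
```python
from datasets import load_dataset

# Single language: each row carries one `sentence` plus metadata.
ukr = load_dataset("Muennighoff/flores200", "ukr_Cyrl", split="dev")

# Hyphenated pairing: rows carry `sentence_eng_Latn` and `sentence_ukr_Cyrl`.
pair = load_dataset("Muennighoff/flores200", "eng_Latn-ukr_Cyrl", split="dev")
print(pair[0]["sentence_eng_Latn"])
```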
## Dataset Structure
### Data Instances
A sample from the `dev` split for the Ukrainian language (`ukr_Cyrl` config) is provided below. All configurations have the same structure, and all sentences are aligned across configurations and splits.
```python
{
'id': 1,
'sentence': 'У понеділок, науковці зі Школи медицини Стенфордського університету оголосили про винайдення нового діагностичного інструменту, що може сортувати клітини за їх видами: це малесенький друкований чіп, який можна виготовити за допомогою стандартних променевих принтерів десь по одному центу США за штуку.',
'URL': 'https://en.wikinews.org/wiki/Scientists_say_new_medical_diagnostic_chip_can_sort_cells_anywhere_with_an_inkjet',
'domain': 'wikinews',
'topic': 'health',
'has_image': 0,
'has_hyperlink': 0
}
```
When using a hyphenated pairing or the `all` configuration, data will be presented as follows:
```python
{
'id': 1,
'URL': 'https://en.wikinews.org/wiki/Scientists_say_new_medical_diagnostic_chip_can_sort_cells_anywhere_with_an_inkjet',
'domain': 'wikinews',
'topic': 'health',
'has_image': 0,
'has_hyperlink': 0,
'sentence_eng_Latn': 'On Monday, scientists from the Stanford University School of Medicine announced the invention of a new diagnostic tool that can sort cells by type: a tiny printable chip that can be manufactured using standard inkjet printers for possibly about one U.S. cent each.',
'sentence_ukr_Cyrl': 'У понеділок, науковці зі Школи медицини Стенфордського університету оголосили про винайдення нового діагностичного інструменту, що може сортувати клітини за їх видами: це малесенький друкований чіп, який можна виготовити за допомогою стандартних променевих принтерів десь по одному центу США за штуку.'
}
```
The text is provided as-is from the original dataset, without further preprocessing or tokenization.
### Data Fields
- `id`: Row number for the data entry, starting at 1.
- `sentence`: The full sentence in the specific language (suffixed with the language code for pairings, e.g. `sentence_eng_Latn`).
- `URL`: The URL for the English article from which the sentence was extracted.
- `domain`: The domain of the sentence.
- `topic`: The topic of the sentence.
- `has_image`: Whether the original article contains an image.
- `has_hyperlink`: Whether the sentence contains a hyperlink.
### Data Splits
| config| `dev`| `devtest`|
|-----------------:|-----:|---------:|
|all configurations| 997| 1012|
## Dataset Creation
Please refer to the original article [No Language Left Behind: Scaling Human-Centered Machine Translation](https://arxiv.org/abs/2207.04672) for additional information on dataset creation.
## Additional Information
### Dataset Curators
See paper for details.
### Licensing Information
Licensed with Creative Commons Attribution Share Alike 4.0. License available [here](https://creativecommons.org/licenses/by-sa/4.0/).
### Citation Information
Please cite the authors if you use these corpora in your work:
```bibtex
@article{nllb2022,
author = {NLLB Team, Marta R. Costa-jussà, James Cross, Onur Çelebi, Maha Elbayad, Kenneth Heafield, Kevin Heffernan, Elahe Kalbassi, Janice Lam, Daniel Licht, Jean Maillard, Anna Sun, Skyler Wang, Guillaume Wenzek, Al Youngblood, Bapi Akula, Loic Barrault, Gabriel Mejia Gonzalez, Prangthip Hansanti, John Hoffman, Semarley Jarrett, Kaushik Ram Sadagopan, Dirk Rowe, Shannon Spruit, Chau Tran, Pierre Andrews, Necip Fazil Ayan, Shruti Bhosale, Sergey Edunov, Angela Fan, Cynthia Gao, Vedanuj Goswami, Francisco Guzmán, Philipp Koehn, Alexandre Mourachko, Christophe Ropers, Safiyyah Saleem, Holger Schwenk, Jeff Wang},
title = {No Language Left Behind: Scaling Human-Centered Machine Translation},
year = {2022}
}
```
|
Andyrasika/VQA-Dataset | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: image_id
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 700662
num_examples: 9974
- name: test
num_bytes: 174412
num_examples: 2494
download_size: 299109
dataset_size: 875074
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: mit
language:
- en
tags:
- VQA
pretty_name: 'VQA '
size_categories:
- 100K<n<1M
---
The dataset is available at: https://www.mpi-inf.mpg.de/departments/computer-vision-and-machine-learning/research/vision-and-language/visual-turing-challenge/

```
@INPROCEEDINGS{malinowski2014nips,
author = {Malinowski, Mateusz and Fritz, Mario},
title = {A Multi-World Approach to Question Answering about Real-World Scenes based on Uncertain Input},
booktitle = {Advances in Neural Information Processing Systems 27},
editor = {Z. Ghahramani and M. Welling and C. Cortes and N.D. Lawrence and K.Q. Weinberger},
pages = {1682--1690},
year = {2014},
publisher = {Curran Associates, Inc.},
url = {http://papers.nips.cc/paper/5411-a-multi-world-approach-to-question-answering-about-real-world-scenes-based-on-uncertain-input.pdf}
}
``` |
AdapterOcean/med_alpaca_standardized_cluster_36_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 153708
num_examples: 390
download_size: 73522
dataset_size: 153708
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_36_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Loke-60000/Christina | ---
license: unknown
language:
- en
pretty_name: Christina Dataset
size_categories:
- 10K<n<100K
tags:
- steins;gate
- steins gate
- Makise Kurisu
- Kurisu
- Christina
---
# Christina Dataset
## Description
The Christina Dataset is a collection of interactions used to train a chatbot inspired by the character Kurisu Makise from the popular "Steins;Gate" games. The dataset comprises textual conversations involving Kurisu's persona, designed to replicate the AI 'Amadeus' from "Steins;Gate 0".
## Access Information
Currently, the Christina Dataset is not publicly available, and access is limited to the developers working on the chatbot project. We appreciate your understanding and interest in our research.
## Contents
The dataset consists of user inputs and corresponding responses in a structured format. It includes a total of 30,032 interactions involving Kurisu's character, providing a diverse range of conversational patterns.
## Contact
For any inquiries related to the Christina Dataset or the chatbot project, please feel free to reach out to the project team at lokman@viktorchondria.com. |
adhok/mmm_info | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 461494
num_examples: 771
download_size: 100066
dataset_size: 461494
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jordyvl/rvl_cdip_easyocr | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- extended|iit_cdip
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
paperswithcode_id: rvl-cdip
pretty_name: RVL-CDIP-EasyOCR
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': letter
'1': form
'2': email
'3': handwritten
'4': advertisement
'5': scientific report
'6': scientific publication
'7': specification
'8': file folder
'9': news article
'10': budget
'11': invoice
'12': presentation
'13': questionnaire
'14': resume
'15': memo
- name: words
sequence: string
- name: boxes
sequence:
sequence: int32
---
# Dataset Card for RVL-CDIP
## Extension
The data loader provides support for loading EasyOCR output files together with the images.
The OCR data is not included under '../data', yet it is available upon request via email <firstname@contract.fit>.
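As a rough usage sketch (assuming the OCR files have been obtained and placed where the loading script expects them; the call below is illustrative rather than guaranteed):
```python
from datasets import load_dataset

# Each record pairs a document image with its EasyOCR output.
ds = load_dataset("jordyvl/rvl_cdip_easyocr", split="test")

sample = ds[0]
print(sample["label"])      # integer class label (see mapping below)
print(sample["words"][:5])  # first few recognized tokens
print(sample["boxes"][:5])  # matching bounding boxes
```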
## Table of Contents
- [Dataset Card for RVL-CDIP](#dataset-card-for-rvl-cdip)
- [Extension](#extension)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [The RVL-CDIP Dataset](https://www.cs.cmu.edu/~aharley/rvl-cdip/)
- **Repository:**
- **Paper:** [Evaluation of Deep Convolutional Nets for Document Image Classification and Retrieval](https://arxiv.org/abs/1502.07058)
- **Leaderboard:** [RVL-CDIP leaderboard](https://paperswithcode.com/dataset/rvl-cdip)
- **Point of Contact:** [Adam W. Harley](mailto:aharley@cmu.edu)
### Dataset Summary
The RVL-CDIP (Ryerson Vision Lab Complex Document Information Processing) dataset consists of 400,000 grayscale images in 16 classes, with 25,000 images per class. There are 320,000 training images, 40,000 validation images, and 40,000 test images. The images are sized so their largest dimension does not exceed 1000 pixels.
### Supported Tasks and Leaderboards
- `image-classification`: The goal of this task is to classify a given document into one of 16 classes representing document types (letter, form, etc.). The leaderboard for this task is available [here](https://paperswithcode.com/sota/document-image-classification-on-rvl-cdip).
### Languages
All the classes and documents use English as their primary language.
## Dataset Structure
### Data Instances
A sample from the training set is provided below :
```
{
'image': <PIL.TiffImagePlugin.TiffImageFile image mode=L size=754x1000 at 0x7F9A5E92CA90>,
'label': 15
}
```
### Data Fields
- `image`: A `PIL.Image.Image` object containing a document.
- `label`: an `int` classification label.
<details>
<summary>Class Label Mappings</summary>
```json
{
"0": "letter",
"1": "form",
"2": "email",
"3": "handwritten",
"4": "advertisement",
"5": "scientific report",
"6": "scientific publication",
"7": "specification",
"8": "file folder",
"9": "news article",
"10": "budget",
"11": "invoice",
"12": "presentation",
"13": "questionnaire",
"14": "resume",
"15": "memo"
}
```
</details>
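As a small illustrative snippet, the standard `datasets` `ClassLabel` API converts between the integer labels and the names above:
```python
# Assuming `ds` was loaded as in the sketch in the Extension section.
label_feature = ds.features["label"]
print(label_feature.int2str(15))       # -> "memo"
print(label_feature.str2int("email"))  # -> 2
```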
### Data Splits
| |train|test|validation|
|----------|----:|----:|---------:|
|# of examples|320000|40000|40000|
The dataset was split in proportions similar to those of ImageNet.
- 320000 images were used for training,
- 40000 images for validation, and
- 40000 images for testing.
## Dataset Creation
### Curation Rationale
From the paper:
> This work makes available a new labelled subset of the IIT-CDIP collection, containing 400,000
document images across 16 categories, useful for training new CNNs for document analysis.
### Source Data
#### Initial Data Collection and Normalization
The same as in the IIT-CDIP collection.
#### Who are the source language producers?
The same as in the IIT-CDIP collection.
### Annotations
#### Annotation process
The same as in the IIT-CDIP collection.
#### Who are the annotators?
The same as in the IIT-CDIP collection.
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
The dataset was curated by the authors - Adam W. Harley, Alex Ufkes, and Konstantinos G. Derpanis.
### Licensing Information
RVL-CDIP is a subset of IIT-CDIP, which came from the [Legacy Tobacco Document Library](https://www.industrydocuments.ucsf.edu/tobacco/), for which license information can be found [here](https://www.industrydocuments.ucsf.edu/help/copyright/).
### Citation Information
```bibtex
@inproceedings{harley2015icdar,
title = {Evaluation of Deep Convolutional Nets for Document Image Classification and Retrieval},
author = {Adam W Harley and Alex Ufkes and Konstantinos G Derpanis},
booktitle = {International Conference on Document Analysis and Recognition ({ICDAR})},
year = {2015}
}
```
### Contributions
Thanks to [@dnaveenr](https://github.com/dnaveenr) for adding this dataset. |
sghirardelli/rgbd-objects-14-classes-uw | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ball
'1': banana
'2': bell_pepper
'3': binder
'4': bowl
'5': calculator
'6': camera
'7': cap
'8': cell_phone
'9': cereal_box
'10': coffee_mug
'11': comb
'12': dry_battery
splits:
- name: train
num_bytes: 255389761.944
num_examples: 10932
download_size: 254509063
dataset_size: 255389761.944
---
# Dataset Card for "rgbd-objects-14-classes-uw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-samsum-0c672345-10275361 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: google/pegasus-large
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: train
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/pegasus-large
* Dataset: samsum
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ikadebi](https://huggingface.co/ikadebi) for evaluating this model. |
mpasila/ParallelFiction-Ja_En-100k-json | ---
license: apache-2.0
task_categories:
- translation
language:
- ja
- en
---
This is my conversion of [NilanE/ParallelFiction-Ja_En-100k](https://huggingface.co/datasets/NilanE/ParallelFiction-Ja_En-100k) into JSON, which can be read by text-generation-webui when training a model.
# Original Dataset card
# Dataset details
Each entry in this dataset is a sentence-aligned Japanese web novel chapter and English fan translation.
The intended use-case is for document translation tasks.
# Dataset format
```json
{
'src' : 'JAPANESE CHAPTER'
'trg' : 'ENGLISH TRANSLATION'
'meta' : {
"source": 'SAME ACROSS ALL ENTRIES',
"series": 'NAME OF WEB NOVEL SERIES',
"missed_lines": 'NUMBER OF LINES THAT WERE AT THE SAME INDEX BUT NOT DETECTED AS BEING TRANSLATIONS OF EACH OTHER',
"inserted_lines_src": 'NUMBER OF LINES IN THE JAPANESE TEXT THAT DID NOT HAVE A MATCHING TRANSLATION BUT ARE BUFFERED BY TRANSLATED LINES',
"inserted_lines_trg": 'SAME AS ABOVE BUT FOR ENGLISH',
}
}
```
A high number of inserted lines is not necessarily a sign of a bad pair, as many translations concatenate or divide source chapters when publishing.
Instead, watch out for high numbers of missed lines or entries where the inserted line count is high for both source and target. |
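As a rough usage sketch (the filename is hypothetical, and it assumes the converted file is a single JSON array of entries in the `src`/`trg`/`meta` format shown above; adjust both to match this repository), poorly aligned pairs can be filtered out before training:
```python
import json

# Hypothetical filename -- substitute the actual JSON file from this repository.
with open("ParallelFiction-Ja_En-100k.json", encoding="utf-8") as f:
    entries = json.load(f)

# Keep only tightly aligned chapter pairs: few missed lines.
clean = [e for e in entries if int(e["meta"]["missed_lines"]) < 10]
print(len(clean), "of", len(entries), "entries kept")
```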
open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1 | ---
pretty_name: Evaluation run of ICBU-NPU/FashionGPT-70B-V1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ICBU-NPU/FashionGPT-70B-V1](https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T15:24:49.736716](https://huggingface.co/datasets/open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1/blob/main/results_2023-10-29T15-24-49.736716.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n\
\ \"em_stderr\": 0.0004566676462666993,\n \"f1\": 0.07125104865771813,\n\
\ \"f1_stderr\": 0.0014102826102321945,\n \"acc\": 0.5589478168926856,\n\
\ \"acc_stderr\": 0.011387742640607\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.0004566676462666993,\n\
\ \"f1\": 0.07125104865771813,\n \"f1_stderr\": 0.0014102826102321945\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2812736921910538,\n \
\ \"acc_stderr\": 0.012384789310940239\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273764\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|arc:challenge|25_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T15_24_49.736716
path:
- '**/details_harness|drop|3_2023-10-29T15-24-49.736716.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T15-24-49.736716.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T15_24_49.736716
path:
- '**/details_harness|gsm8k|5_2023-10-29T15-24-49.736716.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T15-24-49.736716.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hellaswag|10_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T15_24_49.736716
path:
- '**/details_harness|winogrande|5_2023-10-29T15-24-49.736716.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T15-24-49.736716.parquet'
- config_name: results
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- results_2023-09-19T01-12-17.792946.parquet
- split: 2023_10_29T15_24_49.736716
path:
- results_2023-10-29T15-24-49.736716.parquet
- split: latest
path:
- results_2023-10-29T15-24-49.736716.parquet
---
# Dataset Card for Evaluation run of ICBU-NPU/FashionGPT-70B-V1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ICBU-NPU/FashionGPT-70B-V1](https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1",
"harness_winogrande_5",
split="train")
```
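The aggregated metrics can be loaded the same way through the `results` configuration (a minimal sketch based on the configuration list above):
```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest" split
# always points to the most recent results file.
results = load_dataset("open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1",
	"results",
	split="latest")
```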
## Latest results
These are the [latest results from run 2023-10-29T15:24:49.736716](https://huggingface.co/datasets/open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1/blob/main/results_2023-10-29T15-24-49.736716.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.0004566676462666993,
"f1": 0.07125104865771813,
"f1_stderr": 0.0014102826102321945,
"acc": 0.5589478168926856,
"acc_stderr": 0.011387742640607
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.0004566676462666993,
"f1": 0.07125104865771813,
"f1_stderr": 0.0014102826102321945
},
"harness|gsm8k|5": {
"acc": 0.2812736921910538,
"acc_stderr": 0.012384789310940239
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273764
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Asad321/irfan-junejo-tweerts333 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 42301
num_examples: 126
download_size: 14643
dataset_size: 42301
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "irfan-junejo-tweerts333"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nerfgun3/chibi_style | ---
language:
- en
tags:
- stable-diffusion
- text-to-image
license: creativeml-openrail-m
inference: false
---
# Chibi Style Embedding / Textual Inversion
## Usage
To use this embedding you have to download the file as well as drop it into the "\stable-diffusion-webui\embeddings" folder
To use it in a prompt: ```"drawn by chibi_style"```
Use the (Chibi) tag alongside the embedding for best results
If it is too strong, just add [] around it.
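For example, a strengthened and a weakened version of the same prompt (the surrounding tags are just an illustration):
```
a cute girl, (chibi), drawn by chibi_style
a cute girl, (chibi), [drawn by chibi_style]
```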
Trained for 6000 steps
Have fun :)
## Example Pictures
<table>
<tr>
<td><img src=https://i.imgur.com/rXHJyFQ.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/eocJJXg.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/8dA3EUO.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/mmChRb3.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/sooxpE5.png width=100% height=100%/></td>
</tr>
</table>
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
liuyanchen1015/VALUE_qqp_negative_concord | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 262996
num_examples: 1287
- name: test
num_bytes: 2530064
num_examples: 12560
- name: train
num_bytes: 2338570
num_examples: 11233
download_size: 3216982
dataset_size: 5131630
---
# Dataset Card for "VALUE_qqp_negative_concord"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
neurae/dnd_style_intents | ---
dataset_info:
features:
- name: examples
dtype: string
- name: label_names
dtype: string
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 9654988
num_examples: 130570
- name: test
num_bytes: 1208016
num_examples: 16330
- name: eval
num_bytes: 1203046
num_examples: 16321
download_size: 5759885
dataset_size: 12066050
task_categories:
- text-classification
language:
- en
size_categories:
- 100K<n<1M
tags:
- D&D
- intent
- classification
pretty_name: D&D Style Intents
license: apache-2.0
---
# Dataset Card for "dnd_style_intents"
This dataset was designed for the intent classification module of a dialogue system for game developers.
There are about 163K examples across 17 intents in the dataset.
All intents belong to one of two groups: intents for interacting with game mechanics and intents for more accurate dialogue understanding.
Data was generated artificially and augmented with masking and a paraphrase model. All examples are in D&D style.
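A minimal loading sketch (field names taken from the schema above):
```python
from datasets import load_dataset

# "examples" holds the utterance text, "label_names" the intent name,
# and "labels" the corresponding integer class id.
dataset = load_dataset("neurae/dnd_style_intents", split="train")
sample = dataset[0]
print(sample["examples"], "->", sample["label_names"], sample["labels"])
```
The `eval` split can be used for held-out validation during training. |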
huggingartists/king-krule | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/king-krule"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.153776 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/6effd9dd0951f41966e769504644a338.675x675x2.gif')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/king-krule">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">King Krule</div>
<a href="https://genius.com/artists/king-krule">
<div style="text-align: center; font-size: 14px;">@king-krule</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/king-krule).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/king-krule")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|111| -| -|
The 'train' split can easily be divided into 'train', 'validation' & 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/king-krule")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
TheAIchemist13/gramvaani_preprocessed_hi_test | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1310718206.25
num_examples: 1030
download_size: 433338707
dataset_size: 1310718206.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gramvaani_preprocessed_hi_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arthurmluz/GPTextSum_data-xlsum_cstnews_1024_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 30182
num_examples: 20
download_size: 37156
dataset_size: 30182
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "GPTextSum_data-xlsumm_cstnews_1024_results"
rouge= {'rouge1': 0.42328596861870976, 'rouge2': 0.20509969476992213, 'rougeL': 0.32990453663237673, 'rougeLsum': 0.32990453663237673}
bert= {'precision': 0.7561025887727737, 'recall': 0.7690637379884719, 'f1': 0.7619699746370315} |
senthil3226w/autotrain-data-dfun-lk90-yhtx | ---
dataset_info:
features:
- name: Context
dtype: string
- name: Answers
dtype: string
- name: Length
dtype: int64
- name: Language
dtype: string
- name: autotrain_text
dtype: string
splits:
- name: train
num_bytes: 6008875
num_examples: 200
- name: validation
num_bytes: 6008875
num_examples: 200
download_size: 5303902
dataset_size: 12017750
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "autotrain-data-dfun-lk90-yhtx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_169 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1246870720
num_examples: 242960
download_size: 1273560797
dataset_size: 1246870720
---
# Dataset Card for "chunk_169"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nli_tr | ---
annotations_creators:
- expert-generated
language_creators:
- machine-generated
language:
- tr
license:
- cc-by-3.0
- cc-by-4.0
- cc-by-sa-3.0
- mit
- other
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- extended|snli
- extended|multi_nli
task_categories:
- text-classification
task_ids:
- natural-language-inference
- semantic-similarity-scoring
- text-scoring
paperswithcode_id: nli-tr
pretty_name: Natural Language Inference in Turkish
license_details: Open Portion of the American National Corpus
dataset_info:
- config_name: snli_tr
features:
- name: idx
dtype: int32
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 71175743
num_examples: 550152
- name: validation
num_bytes: 1359639
num_examples: 10000
- name: test
num_bytes: 1355409
num_examples: 10000
download_size: 40328942
dataset_size: 73890791
- config_name: multinli_tr
features:
- name: idx
dtype: int32
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 75524150
num_examples: 392702
- name: validation_matched
num_bytes: 1908283
num_examples: 10000
- name: validation_mismatched
num_bytes: 2039392
num_examples: 10000
download_size: 75518512
dataset_size: 79471825
config_names:
- multinli_tr
- snli_tr
---
# Dataset Card for "nli_tr"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/boun-tabi/NLI-TR](https://github.com/boun-tabi/NLI-TR)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 115.85 MB
- **Size of the generated dataset:** 153.36 MB
- **Total amount of disk used:** 269.21 MB
### Dataset Summary
The Natural Language Inference in Turkish (NLI-TR) is a set of two large-scale datasets that were obtained by translating the foundational NLI corpora (SNLI and MNLI) using Amazon Translate.
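Both configurations can be loaded by name (a minimal sketch using the configs and splits listed below):
```python
from datasets import load_dataset

snli_tr = load_dataset("nli_tr", "snli_tr", split="train")
multinli_tr = load_dataset("nli_tr", "multinli_tr", split="validation_matched")
print(snli_tr[0]["premise"], "->", snli_tr[0]["label"])
```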
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### multinli_tr
- **Size of downloaded dataset files:** 75.52 MB
- **Size of the generated dataset:** 79.47 MB
- **Total amount of disk used:** 154.99 MB
An example of 'validation_matched' looks as follows.
```
This example was too long and was cropped:
{
"hypothesis": "Mrinal Sen'in çalışmalarının çoğu Avrupa koleksiyonlarında bulunabilir.",
"idx": 7,
"label": 1,
"premise": "\"Kalküta, sanatsal yaratıcılığa dair herhangi bir iddiaya sahip olan tek diğer üretim merkezi gibi görünüyor, ama ironik bir şek..."
}
```
#### snli_tr
- **Size of downloaded dataset files:** 40.33 MB
- **Size of the generated dataset:** 73.89 MB
- **Total amount of disk used:** 114.22 MB
An example of 'train' looks as follows.
```
{
"hypothesis": "Yaşlı bir adam, kızının işten çıkmasını bekçiyken suyunu içer.",
"idx": 9,
"label": 1,
"premise": "Parlak renkli gömlek çalışanları arka planda gülümseme iken yaşlı bir adam bir kahve dükkanında küçük bir masada onun portakal suyu ile oturur."
}
```
### Data Fields
The data fields are the same among all splits.
#### multinli_tr
- `idx`: a `int32` feature.
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
#### snli_tr
- `idx`: a `int32` feature.
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
### Data Splits
#### multinli_tr
| |train |validation_matched|validation_mismatched|
|-----------|-----:|-----------------:|--------------------:|
|multinli_tr|392702| 10000| 10000|
#### snli_tr
| |train |validation|test |
|-------|-----:|---------:|----:|
|snli_tr|550152| 10000|10000|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{budur-etal-2020-data,
title = "Data and Representation for Turkish Natural Language Inference",
author = "Budur, Emrah and
"{O}zçelik, Rıza and
G"{u}ng"{o}r, Tunga",
booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
abstract = "Large annotated datasets in NLP are overwhelmingly in English. This is an obstacle to progress in other languages. Unfortunately, obtaining new annotated resources for each task in each language would be prohibitively expensive. At the same time, commercial machine translation systems are now robust. Can we leverage these systems to translate English-language datasets automatically? In this paper, we offer a positive response for natural language inference (NLI) in Turkish. We translated two large English NLI datasets into Turkish and had a team of experts validate their translation quality and fidelity to the original labels. Using these datasets, we address core issues of representation for Turkish NLI. We find that in-language embeddings are essential and that morphological parsing can be avoided where the training set is large. Finally, we show that models trained on our machine-translated datasets are successful on human-translated evaluation sets. We share all code, models, and data publicly.",
}
```
### Contributions
Thanks to [@e-budur](https://github.com/e-budur) for adding this dataset. |
atmallen/amazon_polarity_embeddings_random4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: content
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: embedding
sequence: float32
- name: title
dtype: string
splits:
- name: train
num_bytes: 7148364432
num_examples: 3600000
- name: test
num_bytes: 19940712
num_examples: 10000
download_size: 3912035793
dataset_size: 7168305144
---
# Dataset Card for "amazon_polarity_embeddings_random4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kuon_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kuon/クオン/久远 (Azur Lane)
This is the dataset of kuon/クオン/久远 (Azur Lane), containing 197 images and their tags.
The core tags of this character are `long_hair, animal_ears, black_hair, tail, ponytail, brown_eyes, yellow_eyes, hair_ornament, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 197 | 264.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuon_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 197 | 157.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuon_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 462 | 317.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuon_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 197 | 238.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuon_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 462 | 439.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuon_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kuon_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, ainu_clothes, solo, low-tied_long_hair, smile, obi, looking_at_viewer |
| 1 | 6 |  |  |  |  |  | 1girl, obi, simple_background, smile, solo, white_background, ainu_clothes, looking_at_viewer, low-tied_long_hair, full_body |
| 2 | 7 |  |  |  |  |  | 1girl, medium_breasts, solo, blue_hair, looking_at_viewer, smile, cleavage, nude, water, ass, wet |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | ainu_clothes | solo | low-tied_long_hair | smile | obi | looking_at_viewer | simple_background | white_background | full_body | medium_breasts | blue_hair | cleavage | nude | water | ass | wet |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:---------------------|:--------|:------|:--------------------|:--------------------|:-------------------|:------------|:-----------------|:------------|:-----------|:-------|:--------|:------|:------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | X | | X | | X | | | | X | X | X | X | X | X | X |
|
dmrau/trec_dl19 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 2194
num_examples: 43
- name: corpus
num_bytes: 2181810
num_examples: 5482
download_size: 1207481
dataset_size: 2184004
---
# Dataset Card for "trec_dl19"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Weyaxi__OpenHermes-2.5-neural-chat-v3-2-Slerp | ---
pretty_name: Evaluation run of Weyaxi/OpenHermes-2.5-neural-chat-v3-2-Slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/OpenHermes-2.5-neural-chat-v3-2-Slerp](https://huggingface.co/Weyaxi/OpenHermes-2.5-neural-chat-v3-2-Slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__OpenHermes-2.5-neural-chat-v3-2-Slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T18:04:51.228408](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__OpenHermes-2.5-neural-chat-v3-2-Slerp/blob/main/results_2023-12-09T18-04-51.228408.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.644055937606071,\n\
\ \"acc_stderr\": 0.032184807364406556,\n \"acc_norm\": 0.6454677507073991,\n\
\ \"acc_norm_stderr\": 0.03283460519387843,\n \"mc1\": 0.4504283965728274,\n\
\ \"mc1_stderr\": 0.017417264371967646,\n \"mc2\": 0.6104827225746667,\n\
\ \"mc2_stderr\": 0.014972794318436832\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6459044368600683,\n \"acc_stderr\": 0.013975454122756557,\n\
\ \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729124\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6569408484365664,\n\
\ \"acc_stderr\": 0.0047376083401634,\n \"acc_norm\": 0.8542123083051185,\n\
\ \"acc_norm_stderr\": 0.0035217202839105555\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634286,\n \
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634286\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973138,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973138\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577615,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577615\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.01638463841038082,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.01638463841038082\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824782,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824782\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n\
\ \"acc_stderr\": 0.012729785386598568,\n \"acc_norm\": 0.4602346805736636,\n\
\ \"acc_norm_stderr\": 0.012729785386598568\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4504283965728274,\n\
\ \"mc1_stderr\": 0.017417264371967646,\n \"mc2\": 0.6104827225746667,\n\
\ \"mc2_stderr\": 0.014972794318436832\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625849\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6307808946171342,\n \
\ \"acc_stderr\": 0.013293019538066244\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/OpenHermes-2.5-neural-chat-v3-2-Slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|arc:challenge|25_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|gsm8k|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hellaswag|10_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-04-51.228408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T18-04-51.228408.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- '**/details_harness|winogrande|5_2023-12-09T18-04-51.228408.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T18-04-51.228408.parquet'
- config_name: results
data_files:
- split: 2023_12_09T18_04_51.228408
path:
- results_2023-12-09T18-04-51.228408.parquet
- split: latest
path:
- results_2023-12-09T18-04-51.228408.parquet
---
# Dataset Card for Evaluation run of Weyaxi/OpenHermes-2.5-neural-chat-v3-2-Slerp
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/OpenHermes-2.5-neural-chat-v3-2-Slerp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/OpenHermes-2.5-neural-chat-v3-2-Slerp](https://huggingface.co/Weyaxi/OpenHermes-2.5-neural-chat-v3-2-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__OpenHermes-2.5-neural-chat-v3-2-Slerp",
"harness_winogrande_5",
split="train")
```
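The aggregated metrics referenced above can be pulled the same way; this variant uses the "results" configuration and the "latest" split, both of which are declared in this repo's configs:
```python
from datasets import load_dataset

# The "results" config holds the aggregated run metrics, and the "latest"
# split always points at the most recent evaluation timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__OpenHermes-2.5-neural-chat-v3-2-Slerp",
    "results",
    split="latest",
)
print(results[0])
```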
## Latest results
These are the [latest results from run 2023-12-09T18:04:51.228408](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__OpenHermes-2.5-neural-chat-v3-2-Slerp/blob/main/results_2023-12-09T18-04-51.228408.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.644055937606071,
"acc_stderr": 0.032184807364406556,
"acc_norm": 0.6454677507073991,
"acc_norm_stderr": 0.03283460519387843,
"mc1": 0.4504283965728274,
"mc1_stderr": 0.017417264371967646,
"mc2": 0.6104827225746667,
"mc2_stderr": 0.014972794318436832
},
"harness|arc:challenge|25": {
"acc": 0.6459044368600683,
"acc_stderr": 0.013975454122756557,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729124
},
"harness|hellaswag|10": {
"acc": 0.6569408484365664,
"acc_stderr": 0.0047376083401634,
"acc_norm": 0.8542123083051185,
"acc_norm_stderr": 0.0035217202839105555
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800886,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800886
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973138,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973138
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577615,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.01638463841038082,
"acc_norm": 0.4,
"acc_norm_stderr": 0.01638463841038082
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824782,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.012729785386598568,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.012729785386598568
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4504283965728274,
"mc1_stderr": 0.017417264371967646,
"mc2": 0.6104827225746667,
"mc2_stderr": 0.014972794318436832
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625849
},
"harness|gsm8k|5": {
"acc": 0.6307808946171342,
"acc_stderr": 0.013293019538066244
}
}
```
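If you prefer the raw file, the JSON linked above can be fetched with `huggingface_hub`. This is a minimal sketch; the exact nesting of the scores inside the file is an assumption, so inspect the keys before indexing in:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file linked under "Latest results" above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Weyaxi__OpenHermes-2.5-neural-chat-v3-2-Slerp",
    filename="results_2023-12-09T18-04-51.228408.json",
    repo_type="dataset",
)
with open(path) as f:
    payload = json.load(f)

# The per-task scores shown above live inside this payload; its exact
# nesting is not documented here, so list the top-level keys first.
print(list(payload.keys()))
```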
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sunglyul/stt_data_231114 | ---
license: apache-2.0
---
|
Qdrant/dbpedia-entities-openai3-text-embedding-3-large-1536-1M | ---
dataset_info:
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: text-embedding-3-large-1536-embedding
sequence: float64
splits:
- name: train
num_bytes: 12679725776
num_examples: 1000000
download_size: 9551862565
dataset_size: 12679725776
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- feature-extraction
language:
- en
size_categories:
- 1M<n<10M
---
# 1M OpenAI Embeddings: text-embedding-3-large 1536 dimensions
- Created: February 2024.
- Text used for Embedding: title (string) + text (string)
- Embedding Model: OpenAI text-embedding-3-large
- This dataset was generated from the first 1M entries of https://huggingface.co/datasets/BeIR/dbpedia-entity, extracted by @KShivendu_ [here](https://huggingface.co/datasets/KShivendu/dbpedia-entities-openai-1M)
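A minimal usage sketch (assuming `numpy` is available and using the `text-embedding-3-large-1536-embedding` column declared in this repo's schema):
```python
import itertools

import numpy as np
from datasets import load_dataset

COL = "text-embedding-3-large-1536-embedding"  # column name from this repo's schema

# Stream a small sample instead of downloading the full ~9.5 GB of parquet.
ds = load_dataset(
    "Qdrant/dbpedia-entities-openai3-text-embedding-3-large-1536-1M",
    split="train",
    streaming=True,
)
rows = list(itertools.islice(ds, 1000))
embeddings = np.asarray([row[COL] for row in rows], dtype=np.float32)

# Cosine similarity of the first sampled entity against the whole sample.
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)
scores = embeddings @ embeddings[0]
for i in np.argsort(-scores)[:5]:
    print(f"{scores[i]:.3f}  {rows[i]['title']}")
```
To search with your own query text, embed it with the same `text-embedding-3-large` model at a matching 1536 dimensions before comparing. |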
collabora/multilingual-librispeech-webdataset | ---
license: cc-by-2.0
---
|
Haxirus/DDInter_Train_Test_Val_Parquets | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 392554890
num_examples: 1243666
- name: test
num_bytes: 21787965
num_examples: 69093
- name: validation
num_bytes: 21879990
num_examples: 69093
download_size: 148495183
dataset_size: 436222845
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
sysrobson/sde4scxdr4 | ---
license: mit
---
|
gowitheflowlab/multi-pooled | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
splits:
- name: train
num_bytes: 528983104.40889275
num_examples: 1963485
download_size: 290954453
dataset_size: 528983104.40889275
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "multi-pooled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andrewatef/test123 | ---
license: apache-2.0
---
|
autoevaluate/autoeval-eval-futin__feed-sen_vi-b48d12-2175169951 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-3b
metrics: []
dataset_name: futin/feed
dataset_config: sen_vi
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-3b
* Dataset: futin/feed
* Config: sen_vi
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
HossainRabby/LAMINI | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2150284.5
num_examples: 1260
- name: test
num_bytes: 238920.5
num_examples: 140
download_size: 698665
dataset_size: 2389205.0
---
# Dataset Card for "LAMINI"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Neuronovo__neuronovo-7B-v0.3 | ---
pretty_name: Evaluation run of Neuronovo/neuronovo-7B-v0.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Neuronovo/neuronovo-7B-v0.3](https://huggingface.co/Neuronovo/neuronovo-7B-v0.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Neuronovo__neuronovo-7B-v0.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T19:14:03.178334](https://huggingface.co/datasets/open-llm-leaderboard/details_Neuronovo__neuronovo-7B-v0.3/blob/main/results_2024-01-10T19-14-03.178334.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6550616143531793,\n\
\ \"acc_stderr\": 0.03198202339154783,\n \"acc_norm\": 0.6562134351788875,\n\
\ \"acc_norm_stderr\": 0.032625803170979364,\n \"mc1\": 0.576499388004896,\n\
\ \"mc1_stderr\": 0.01729742144853475,\n \"mc2\": 0.7134545855534601,\n\
\ \"mc2_stderr\": 0.014949989648989805\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7090443686006825,\n \"acc_stderr\": 0.01327307786590759,\n\
\ \"acc_norm\": 0.726962457337884,\n \"acc_norm_stderr\": 0.013019332762635751\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.719577773351922,\n\
\ \"acc_stderr\": 0.004482874732237348,\n \"acc_norm\": 0.8825931089424417,\n\
\ \"acc_norm_stderr\": 0.0032124662717039057\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.04940635630605659,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.04940635630605659\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265026,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265026\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4569832402234637,\n\
\ \"acc_stderr\": 0.01666049858050917,\n \"acc_norm\": 0.4569832402234637,\n\
\ \"acc_norm_stderr\": 0.01666049858050917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.012741974333897229,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.012741974333897229\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.576499388004896,\n\
\ \"mc1_stderr\": 0.01729742144853475,\n \"mc2\": 0.7134545855534601,\n\
\ \"mc2_stderr\": 0.014949989648989805\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510436\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6141015921152388,\n \
\ \"acc_stderr\": 0.01340907747131917\n }\n}\n```"
repo_url: https://huggingface.co/Neuronovo/neuronovo-7B-v0.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|arc:challenge|25_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|gsm8k|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hellaswag|10_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T19-14-03.178334.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T19-14-03.178334.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- '**/details_harness|winogrande|5_2024-01-10T19-14-03.178334.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T19-14-03.178334.parquet'
- config_name: results
data_files:
- split: 2024_01_10T19_14_03.178334
path:
- results_2024-01-10T19-14-03.178334.parquet
- split: latest
path:
- results_2024-01-10T19-14-03.178334.parquet
---
# Dataset Card for Evaluation run of Neuronovo/neuronovo-7B-v0.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Neuronovo/neuronovo-7B-v0.3](https://huggingface.co/Neuronovo/neuronovo-7B-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Neuronovo__neuronovo-7B-v0.3",
"harness_winogrande_5",
split="train")
```
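The aggregated metrics live in the "results" configuration; a minimal sketch of loading its latest snapshot:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" always points to the newest run.
results = load_dataset("open-llm-leaderboard/details_Neuronovo__neuronovo-7B-v0.3",
	"results",
	split="latest")
print(results[0])
```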
## Latest results
These are the [latest results from run 2024-01-10T19:14:03.178334](https://huggingface.co/datasets/open-llm-leaderboard/details_Neuronovo__neuronovo-7B-v0.3/blob/main/results_2024-01-10T19-14-03.178334.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6550616143531793,
"acc_stderr": 0.03198202339154783,
"acc_norm": 0.6562134351788875,
"acc_norm_stderr": 0.032625803170979364,
"mc1": 0.576499388004896,
"mc1_stderr": 0.01729742144853475,
"mc2": 0.7134545855534601,
"mc2_stderr": 0.014949989648989805
},
"harness|arc:challenge|25": {
"acc": 0.7090443686006825,
"acc_stderr": 0.01327307786590759,
"acc_norm": 0.726962457337884,
"acc_norm_stderr": 0.013019332762635751
},
"harness|hellaswag|10": {
"acc": 0.719577773351922,
"acc_stderr": 0.004482874732237348,
"acc_norm": 0.8825931089424417,
"acc_norm_stderr": 0.0032124662717039057
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.04940635630605659,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.04940635630605659
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477518,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477518
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476074,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476074
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265026,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265026
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4569832402234637,
"acc_stderr": 0.01666049858050917,
"acc_norm": 0.4569832402234637,
"acc_norm_stderr": 0.01666049858050917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897229,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897229
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.576499388004896,
"mc1_stderr": 0.01729742144853475,
"mc2": 0.7134545855534601,
"mc2_stderr": 0.014949989648989805
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510436
},
"harness|gsm8k|5": {
"acc": 0.6141015921152388,
"acc_stderr": 0.01340907747131917
}
}
```
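As a minimal sketch of working with such a file (assuming a locally downloaded results JSON shaped exactly like the block above; the path `results.json` is a placeholder), the per-task accuracies can be collected in a few lines:

```python
import json

# Hypothetical local copy of a results_*.json file with the shape shown above.
with open("results.json") as f:
    results = json.load(f)

# Keep only the per-task entries that report an accuracy.
accs = {
    task: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|") and "acc" in scores
}

# E.g., list the five lowest-scoring tasks for this run.
for task, acc in sorted(accs.items(), key=lambda kv: kv[1])[:5]:
    print(f"{acc:.3f}  {task}")
```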
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kike/fito_coco | ---
license: mit
---
|
sunhaozhepy/sst_roberta_keywords_embeddings | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: float32
- name: tokens
dtype: string
- name: tree
dtype: string
- name: keywords
dtype: string
- name: keywords_embeddings
sequence: float32
splits:
- name: train
num_bytes: 29316285
num_examples: 8544
- name: validation
num_bytes: 3780849
num_examples: 1101
- name: test
num_bytes: 7584117
num_examples: 2210
download_size: 46990512
dataset_size: 40681251
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
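The card ships without usage notes; as a minimal sketch grounded only in the feature and split declarations above (the `default` configuration), the data can be loaded and inspected like so:

```python
from datasets import load_dataset

# Load the train split of the default configuration declared above.
ds = load_dataset("sunhaozhepy/sst_roberta_keywords_embeddings", split="train")

# Each row pairs an SST sentence and float label with its parse tree,
# extracted keywords, and a precomputed float32 keyword embedding.
row = ds[0]
print(row["sentence"], row["label"])
print(row["keywords"], len(row["keywords_embeddings"]))
```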
|
salmanshahid/omega-mm | ---
license: mit
---
|
open-llm-leaderboard/details_andysalerno__openchat-nectar-0.11 | ---
pretty_name: Evaluation run of andysalerno/openchat-nectar-0.11
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [andysalerno/openchat-nectar-0.11](https://huggingface.co/andysalerno/openchat-nectar-0.11)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__openchat-nectar-0.11\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T17:37:46.856873](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.11/blob/main/results_2024-01-21T17-37-46.856873.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6550217713416889,\n\
\ \"acc_stderr\": 0.031883973576992575,\n \"acc_norm\": 0.6556984270442959,\n\
\ \"acc_norm_stderr\": 0.032539393609809474,\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5291674189872531,\n\
\ \"mc2_stderr\": 0.015420698178455278\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.014124597881844461,\n\
\ \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.01382204792228351\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6379207329217288,\n\
\ \"acc_stderr\": 0.004796193584930072,\n \"acc_norm\": 0.8328022306313483,\n\
\ \"acc_norm_stderr\": 0.0037238973056454936\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554963,\n \"\
acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554963\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829194,\n \
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829194\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.03021683101150878,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.03021683101150878\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\
\ \"acc_stderr\": 0.013223928616741624,\n \"acc_norm\": 0.8365261813537676,\n\
\ \"acc_norm_stderr\": 0.013223928616741624\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n\
\ \"acc_stderr\": 0.014614465821966348,\n \"acc_norm\": 0.2569832402234637,\n\
\ \"acc_norm_stderr\": 0.014614465821966348\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.02357688174400572,\n\
\ \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.02357688174400572\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4908735332464146,\n\
\ \"acc_stderr\": 0.01276810860164001,\n \"acc_norm\": 0.4908735332464146,\n\
\ \"acc_norm_stderr\": 0.01276810860164001\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7316176470588235,\n \"acc_stderr\": 0.026917481224377204,\n\
\ \"acc_norm\": 0.7316176470588235,\n \"acc_norm_stderr\": 0.026917481224377204\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n\
\ \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5291674189872531,\n\
\ \"mc2_stderr\": 0.015420698178455278\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6846095526914329,\n \
\ \"acc_stderr\": 0.012799353675801838\n }\n}\n```"
repo_url: https://huggingface.co/andysalerno/openchat-nectar-0.11
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|arc:challenge|25_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|gsm8k|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hellaswag|10_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T17-37-46.856873.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T17-37-46.856873.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- '**/details_harness|winogrande|5_2024-01-21T17-37-46.856873.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T17-37-46.856873.parquet'
- config_name: results
data_files:
- split: 2024_01_21T17_37_46.856873
path:
- results_2024-01-21T17-37-46.856873.parquet
- split: latest
path:
- results_2024-01-21T17-37-46.856873.parquet
---
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.11
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.11](https://huggingface.co/andysalerno/openchat-nectar-0.11) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.11",
"harness_winogrande_5",
split="train")
```
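The aggregated scores live in the "results" configuration; as a minimal sketch (config and split names taken from the YAML metadata above), they load the same way:

```python
from datasets import load_dataset

# The "results" configuration aggregates all task scores for a run; the
# "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_andysalerno__openchat-nectar-0.11",
    "results",
    split="latest",
)

# A specific run can be pinned via its timestamp-named split instead,
# e.g. "2024_01_21T17_37_46.856873" for the run documented here.
print(results)
```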
## Latest results
These are the [latest results from run 2024-01-21T17:37:46.856873](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.11/blob/main/results_2024-01-21T17-37-46.856873.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6550217713416889,
"acc_stderr": 0.031883973576992575,
"acc_norm": 0.6556984270442959,
"acc_norm_stderr": 0.032539393609809474,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5291674189872531,
"mc2_stderr": 0.015420698178455278
},
"harness|arc:challenge|25": {
"acc": 0.6279863481228669,
"acc_stderr": 0.014124597881844461,
"acc_norm": 0.6621160409556314,
"acc_norm_stderr": 0.01382204792228351
},
"harness|hellaswag|10": {
"acc": 0.6379207329217288,
"acc_stderr": 0.004796193584930072,
"acc_norm": 0.8328022306313483,
"acc_norm_stderr": 0.0037238973056454936
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267438,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267438
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554963,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829194,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829194
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.03021683101150878,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.03021683101150878
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741624,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741624
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2569832402234637,
"acc_stderr": 0.014614465821966348,
"acc_norm": 0.2569832402234637,
"acc_norm_stderr": 0.014614465821966348
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7654320987654321,
"acc_stderr": 0.02357688174400572,
"acc_norm": 0.7654320987654321,
"acc_norm_stderr": 0.02357688174400572
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4908735332464146,
"acc_stderr": 0.01276810860164001,
"acc_norm": 0.4908735332464146,
"acc_norm_stderr": 0.01276810860164001
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7316176470588235,
"acc_stderr": 0.026917481224377204,
"acc_norm": 0.7316176470588235,
"acc_norm_stderr": 0.026917481224377204
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5291674189872531,
"mc2_stderr": 0.015420698178455278
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140505
},
"harness|gsm8k|5": {
"acc": 0.6846095526914329,
"acc_stderr": 0.012799353675801838
}
}
```
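For convenience, here is a minimal sketch of reading these aggregated numbers programmatically; it assumes this repo follows the usual leaderboard layout, with a `results` config whose `latest` split mirrors the JSON above:
```python
from datasets import load_dataset

# Sketch: the "results" config and "latest" split are assumed from the
# standard leaderboard layout; they mirror the JSON block shown above.
results = load_dataset(
    "open-llm-leaderboard/details_andysalerno__openchat-nectar-0.11",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics of the newest run
```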
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_circulus__Llama-2-7b-orca-v1 | ---
pretty_name: Evaluation run of circulus/Llama-2-7b-orca-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [circulus/Llama-2-7b-orca-v1](https://huggingface.co/circulus/Llama-2-7b-orca-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_circulus__Llama-2-7b-orca-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T21:26:35.463636](https://huggingface.co/datasets/open-llm-leaderboard/details_circulus__Llama-2-7b-orca-v1/blob/main/results_2023-09-16T21-26-35.463636.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08557046979865772,\n\
\ \"em_stderr\": 0.0028646840549845006,\n \"f1\": 0.15811556208053656,\n\
\ \"f1_stderr\": 0.003126158993030364,\n \"acc\": 0.4151299715828343,\n\
\ \"acc_stderr\": 0.009762520250486784\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08557046979865772,\n \"em_stderr\": 0.0028646840549845006,\n\
\ \"f1\": 0.15811556208053656,\n \"f1_stderr\": 0.003126158993030364\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07808946171341925,\n \
\ \"acc_stderr\": 0.007390654481108218\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.01213438601986535\n\
\ }\n}\n```"
repo_url: https://huggingface.co/circulus/Llama-2-7b-orca-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T21_26_35.463636
path:
- '**/details_harness|drop|3_2023-09-16T21-26-35.463636.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T21-26-35.463636.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T21_26_35.463636
path:
- '**/details_harness|gsm8k|5_2023-09-16T21-26-35.463636.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T21-26-35.463636.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T21_26_35.463636
path:
- '**/details_harness|winogrande|5_2023-09-16T21-26-35.463636.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T21-26-35.463636.parquet'
- config_name: results
data_files:
- split: 2023_09_16T21_26_35.463636
path:
- results_2023-09-16T21-26-35.463636.parquet
- split: latest
path:
- results_2023-09-16T21-26-35.463636.parquet
---
# Dataset Card for Evaluation run of circulus/Llama-2-7b-orca-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/circulus/Llama-2-7b-orca-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [circulus/Llama-2-7b-orca-v1](https://huggingface.co/circulus/Llama-2-7b-orca-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_circulus__Llama-2-7b-orca-v1",
"harness_winogrande_5",
split="train")
```
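Because each run is also stored under its own timestamped split, you can pin an exact run instead of following the alias; a minimal sketch, with split names taken from the `configs` section of this card:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_circulus__Llama-2-7b-orca-v1"

# "latest" always follows the newest run, while the timestamped split
# pins this specific evaluation run (both names appear in the configs).
latest = load_dataset(REPO, "harness_gsm8k_5", split="latest")
pinned = load_dataset(REPO, "harness_gsm8k_5", split="2023_09_16T21_26_35.463636")
```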
## Latest results
These are the [latest results from run 2023-09-16T21:26:35.463636](https://huggingface.co/datasets/open-llm-leaderboard/details_circulus__Llama-2-7b-orca-v1/blob/main/results_2023-09-16T21-26-35.463636.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08557046979865772,
"em_stderr": 0.0028646840549845006,
"f1": 0.15811556208053656,
"f1_stderr": 0.003126158993030364,
"acc": 0.4151299715828343,
"acc_stderr": 0.009762520250486784
},
"harness|drop|3": {
"em": 0.08557046979865772,
"em_stderr": 0.0028646840549845006,
"f1": 0.15811556208053656,
"f1_stderr": 0.003126158993030364
},
"harness|gsm8k|5": {
"acc": 0.07808946171341925,
"acc_stderr": 0.007390654481108218
},
"harness|winogrande|5": {
"acc": 0.7521704814522494,
"acc_stderr": 0.01213438601986535
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nasa-cisto-data-science-group/modis-lake-powell-toy-dataset | ---
license: apache-2.0
size_categories:
- n<1K
---
# MODIS Water Lake Powell Toy Dataset
### Dataset Summary
A tabular dataset composed of MODIS surface reflectance bands, calculated indices, and a label (water/not-water).
## Dataset Structure
### Data Fields
- `water`: Label, water or not-water (binary)
- `sur_refl_b01_1`: MODIS surface reflectance band 1 (-100, 16000)
- `sur_refl_b02_1`: MODIS surface reflectance band 2 (-100, 16000)
- `sur_refl_b03_1`: MODIS surface reflectance band 3 (-100, 16000)
- `sur_refl_b04_1`: MODIS surface reflectance band 4 (-100, 16000)
- `sur_refl_b05_1`: MODIS surface reflectance band 5 (-100, 16000)
- `sur_refl_b06_1`: MODIS surface reflectance band 6 (-100, 16000)
- `sur_refl_b07_1`: MODIS surface reflectance band 7 (-100, 16000)
- `ndvi`: Normalized difference vegetation index (-20000, 20000); see the sketch after this list
- `ndwi1`: Normalized difference water index 1 (-20000, 20000)
- `ndwi2`: Normalized difference water index 2 (-20000, 20000)
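As a rough illustration of how the index columns relate to the band columns, here is a common NDVI formulation. The band pairing (band 1 = red, band 2 = NIR) and the integer scaling are assumptions for illustration only; the card does not state how its `ndvi` column was computed:
```python
# Rough sketch of a common NDVI formulation from red (band 1) and NIR
# (band 2) surface reflectance. The band pairing and the x10000 scaling
# are assumptions; the card does not specify how the `ndvi` column
# (range -20000..20000) was actually derived.
def ndvi(sur_refl_b01: float, sur_refl_b02: float, scale: int = 10000) -> int:
    red, nir = sur_refl_b01, sur_refl_b02
    return int(scale * (nir - red) / (nir + red))

print(ndvi(sur_refl_b01=800, sur_refl_b02=3000))  # vegetated pixel -> 5789
```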
### Data Splits
Train and test split. Test is 200 rows, train is 800.
## Dataset Creation
## Source Data
[MODIS MOD44W](https://lpdaac.usgs.gov/products/mod44wv006/)
[MODIS MOD09GA](https://lpdaac.usgs.gov/products/mod09gav006/)
[MODIS MOD09GQ](https://lpdaac.usgs.gov/products/mod09gqv006/)
## Annotation process
Labels were created using the MOD44W C6 product to designate pixels in MODIS surface reflectance products as land or water. |
ylacombe/jenny-tts-10k-tagged | ---
dataset_info:
features:
- name: file_name
dtype: string
- name: text
dtype: string
- name: transcription_normalised
dtype: string
- name: utterance_pitch_mean
dtype: float32
- name: utterance_pitch_std
dtype: float32
- name: snr
dtype: float64
- name: c50
dtype: float64
- name: speaking_rate
dtype: string
- name: phonemes
dtype: string
- name: speaker_id
dtype: int64
- name: gender
dtype: string
- name: pitch
dtype: string
- name: noise
dtype: string
- name: reverberation
dtype: string
- name: speech_monotony
dtype: string
- name: text_description
dtype: string
splits:
- name: train
num_bytes: 11156102
num_examples: 20978
download_size: 4644295
dataset_size: 11156102
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anan-2024/twitter_dataset_1713181926 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 65351
num_examples: 170
download_size: 42391
dataset_size: 65351
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tasksource/ruletaker | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: label
dtype: string
- name: config
dtype: string
splits:
- name: train
num_bytes: 252209259
num_examples: 480152
- name: dev
num_bytes: 39591713
num_examples: 75872
- name: test
num_bytes: 80649163
num_examples: 151911
download_size: 34172740
dataset_size: 372450135
license: apache-2.0
language:
- en
---
# Dataset Card for "ruletaker"
https://github.com/allenai/ruletaker
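A minimal load sketch, with split and field names taken from the `dataset_info` block above:
```python
from datasets import load_dataset

# Splits (train/dev/test) and fields (context, question, label, config)
# follow the dataset_info block above.
dev = load_dataset("tasksource/ruletaker", split="dev")
example = dev[0]
print(example["config"], example["label"])
print(example["question"])
```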
```
@inproceedings{ruletaker2020,
title = {Transformers as Soft Reasoners over Language},
author = {Clark, Peter and Tafjord, Oyvind and Richardson, Kyle},
booktitle = {Proceedings of the Twenty-Ninth International Joint Conference on
Artificial Intelligence, {IJCAI-20}},
publisher = {International Joint Conferences on Artificial Intelligence Organization},
editor = {Christian Bessiere},
pages = {3882--3890},
year = {2020},
month = {7},
note = {Main track},
doi = {10.24963/ijcai.2020/537},
url = {https://doi.org/10.24963/ijcai.2020/537},
}
``` |
Nexdata/Latin_American_Speaking_English_Speech_Data_by_Mobile_Phone | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Latin_American_Speaking_English_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1021?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
281 Latin American speakers were recorded speaking authentic English in a relatively quiet environment. The recording script was designed by linguists and covers a wide range of topics, including generic, interactive, on-board, and home scenarios. The text was manually proofread with high accuracy. The recordings match mainstream Android and Apple system phones.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1021?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Latin American English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
okasumov/miracle_ai | ---
license: gpl
---
|
THUDM/AgentInstruct | ---
configs:
- config_name: default
data_files:
- split: os
path: data/os-*
- split: db
path: data/db-*
- split: alfworld
path: data/alfworld-*
- split: webshop
path: data/webshop-*
- split: kg
path: data/kg-*
- split: mind2web
path: data/mind2web-*
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: loss
dtype: bool
- name: value
dtype: string
- name: id
dtype: string
splits:
- name: os
num_bytes: 660245
num_examples: 195
- name: db
num_bytes: 1436655
num_examples: 538
- name: alfworld
num_bytes: 1223363
num_examples: 336
- name: webshop
num_bytes: 1602648
num_examples: 351
- name: kg
num_bytes: 2960010
num_examples: 324
- name: mind2web
num_bytes: 159590
num_examples: 122
download_size: 1255385
dataset_size: 8042511
language:
- en
pretty_name: AgentInstruct
---
# AgentInstruct Dataset
<p align="center">
🤗 <a href="https://huggingface.co/THUDM/agentlm-70b" target="_blank">[Models]</a> • 💻 <a href="https://github.com/THUDM/AgentTuning" target="_blank">[Github Repo]</a> • 📌 <a href="https://THUDM.github.io/AgentTuning/" target="_blank">[Project Page]</a> • 📃 <a href="https://arxiv.org/abs/2310.12823" target="_blank">[Paper]</a>
</p>
**AgentInstruct** is a meticulously curated dataset featuring **1,866** high-quality interactions, designed to enhance AI agents across six diverse real-world tasks, leveraging innovative methods like **Task Derivation** and **Self-Instruct**.
- 🔍 **CoT** - Harness the power of [ReAct](https://react-lm.github.io/), offering detailed thought explanations for each action, ensuring an intricate understanding of the model's decision-making journey.
- 🌍 **Diversity** - Spanning 6 real-world scenarios, from Daily Household Routines to Database Operations, with average interaction turns ranging from 5 to 35.
- 🎯 **Precision** - Not all trajectories of GPT-4 are effective! Ours are rigorously filtered using strict rewards to ensure top-notch quality.
- ✅ **Assurance** - Rigorous checks to avoid data leakage, ensuring pristine dataset quality.
## Task Overview
| Task | # Filt. Traj. | Avg # Filt. Traj. Turns |
|---|---|---|
|ALFWorld|336|13.52|
|WebShop|351|3.68|
|Mind2Web|122|1.00|
|Knowledge Graph|324|6.04|
|Operating System|195|3.85|
|Database|538|2.06|
|**AgentInstruct**|1866|5.24|
AgentInstruct includes 1,866 trajectories from 6 agent tasks. "Traj." stands for interaction trajectory. "Filt. Traj." stands for filtered trajectories.
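A minimal sketch for pulling one task's trajectories, with split and field names taken from the YAML header of this card:
```python
from datasets import load_dataset

# Each task is its own split of the default config (os, db, alfworld,
# webshop, kg, mind2web), per the YAML header above.
alfworld = load_dataset("THUDM/AgentInstruct", split="alfworld")
sample = alfworld[0]
print(sample["id"])
print(sample["conversations"][0]["from"], sample["conversations"][0]["value"][:100])
```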
## Models
**AgentLM** models are produced by mixed training on AgentInstruct dataset and ShareGPT dataset from Llama-2-chat models.
The models follow the conversation format of [Llama-2-chat](https://huggingface.co/blog/llama2#how-to-prompt-llama-2), with system prompt fixed as
```
You are a helpful, respectful and honest assistant.
```
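In the Llama-2-chat template linked above, that system prompt sits inside a `<<SYS>>` block; here is a minimal single-turn sketch (the layout comes from the linked Llama-2 prompting guide, not from this card):
```python
# Minimal single-turn sketch of the Llama-2-chat prompt layout with the
# fixed system prompt (layout per the linked Llama-2 prompting guide).
SYSTEM = "You are a helpful, respectful and honest assistant."

def build_prompt(user_message: str) -> str:
    return f"<s>[INST] <<SYS>>\n{SYSTEM}\n<</SYS>>\n\n{user_message} [/INST]"

print(build_prompt("List all files in the current directory."))
```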
7B, 13B, and 70B models are available on Huggingface model hub.
|Model|Huggingface Repo|
|---|---|
|AgentLM-7B| [🤗Huggingface Repo](https://huggingface.co/THUDM/agentlm-7b) |
|AgentLM-13B| [🤗Huggingface Repo](https://huggingface.co/THUDM/agentlm-13b) |
|AgentLM-70B| [🤗Huggingface Repo](https://huggingface.co/THUDM/agentlm-70b) |
Check our [[Github Repo]](https://github.com/THUDM/AgentTuning) for details about **AgentTuning**.
## Citation
If you find our work useful, please consider citing AgentTuning:
```
@misc{zeng2023agenttuning,
title={AgentTuning: Enabling Generalized Agent Abilities for LLMs},
author={Aohan Zeng and Mingdao Liu and Rui Lu and Bowen Wang and Xiao Liu and Yuxiao Dong and Jie Tang},
year={2023},
eprint={2310.12823},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
ChunB1/phi-2-symbol-100k-en | ---
dataset_info:
features:
- name: task
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
- name: symbols
sequence: string
splits:
- name: train
num_bytes: 260358964.38112
num_examples: 99508
- name: validation
num_bytes: 10724
num_examples: 5
download_size: 132918722
dataset_size: 260369688.38112
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
BangumiBase/demichanwakataritai | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Demi-chan Wa Kataritai
This is the image base of bangumi Demi-chan wa Kataritai, we detected 16 characters, 1889 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 379 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 33 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 221 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 373 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 35 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 59 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 11 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 75 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 14 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 18 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 34 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 20 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 252 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 203 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 87 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 75 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
open-llm-leaderboard/details_alnrg2arg__test2 | ---
pretty_name: Evaluation run of alnrg2arg/test2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [alnrg2arg/test2](https://huggingface.co/alnrg2arg/test2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__test2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-15T11:22:11.663514](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test2/blob/main/results_2024-01-15T11-22-11.663514.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24636436924076846,\n\
\ \"acc_stderr\": 0.03057531615216942,\n \"acc_norm\": 0.24707158644894944,\n\
\ \"acc_norm_stderr\": 0.031385477138922584,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662571,\n \"mc2\": 0.5013831681930769,\n\
\ \"mc2_stderr\": 0.017248638043307455\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23293515358361774,\n \"acc_stderr\": 0.012352507042617408,\n\
\ \"acc_norm\": 0.2721843003412969,\n \"acc_norm_stderr\": 0.013006600406423706\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2539334793865764,\n\
\ \"acc_stderr\": 0.0043437045123801,\n \"acc_norm\": 0.26249751045608444,\n\
\ \"acc_norm_stderr\": 0.004390923353200561\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\"\
: 0.03455473702325437,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\"\
: 0.03455473702325437\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \
\ \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343604,\n\
\ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.0336876293225943,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.0336876293225943\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2723404255319149,\n \"acc_stderr\": 0.029101290698386708,\n\
\ \"acc_norm\": 0.2723404255319149,\n \"acc_norm_stderr\": 0.029101290698386708\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003336,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23544973544973544,\n \"acc_stderr\": 0.021851509822031715,\n \"\
acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.021851509822031715\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n\
\ \"acc_stderr\": 0.024472243840895518,\n \"acc_norm\": 0.24516129032258063,\n\
\ \"acc_norm_stderr\": 0.024472243840895518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.03090379695211449,\n\
\ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.03090379695211449\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\"\
: 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.03158415324047709,\n\
\ \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.03158415324047709\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.29292929292929293,\n \"acc_stderr\": 0.03242497958178817,\n \"\
acc_norm\": 0.29292929292929293,\n \"acc_norm_stderr\": 0.03242497958178817\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2694300518134715,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.2694300518134715,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.02213908110397153,\n \
\ \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.02213908110397153\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.0260671592222758,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.0260671592222758\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380558,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380558\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22935779816513763,\n \"acc_stderr\": 0.018025349724618684,\n \"\
acc_norm\": 0.22935779816513763,\n \"acc_norm_stderr\": 0.018025349724618684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.24537037037037038,\n \"acc_stderr\": 0.02934666509437295,\n \"\
acc_norm\": 0.24537037037037038,\n \"acc_norm_stderr\": 0.02934666509437295\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460295,\n \
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460295\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2242152466367713,\n\
\ \"acc_stderr\": 0.027991534258519527,\n \"acc_norm\": 0.2242152466367713,\n\
\ \"acc_norm_stderr\": 0.027991534258519527\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467764,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467764\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21794871794871795,\n\
\ \"acc_stderr\": 0.027046857630716667,\n \"acc_norm\": 0.21794871794871795,\n\
\ \"acc_norm_stderr\": 0.027046857630716667\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24648786717752236,\n\
\ \"acc_stderr\": 0.015411308769686938,\n \"acc_norm\": 0.24648786717752236,\n\
\ \"acc_norm_stderr\": 0.015411308769686938\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0230836585869842,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0230836585869842\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n\
\ \"acc_stderr\": 0.014593620923210746,\n \"acc_norm\": 0.2558659217877095,\n\
\ \"acc_norm_stderr\": 0.014593620923210746\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351277,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351277\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.2861736334405145,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.02399350170904212,\n\
\ \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.02399350170904212\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843003,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843003\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24967405475880053,\n\
\ \"acc_stderr\": 0.011054538377832322,\n \"acc_norm\": 0.24967405475880053,\n\
\ \"acc_norm_stderr\": 0.011054538377832322\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.02488097151229426,\n\
\ \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.02488097151229426\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25326797385620914,\n \"acc_stderr\": 0.01759348689536683,\n \
\ \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.01759348689536683\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22040816326530613,\n \"acc_stderr\": 0.026537045312145284,\n\
\ \"acc_norm\": 0.22040816326530613,\n \"acc_norm_stderr\": 0.026537045312145284\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\
\ \"acc_stderr\": 0.033844291552331346,\n \"acc_norm\": 0.25301204819277107,\n\
\ \"acc_norm_stderr\": 0.033844291552331346\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662571,\n \"mc2\": 0.5013831681930769,\n\
\ \"mc2_stderr\": 0.017248638043307455\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4988161010260458,\n \"acc_stderr\": 0.014052446290529022\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/alnrg2arg/test2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|arc:challenge|25_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|gsm8k|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hellaswag|10_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T11-22-11.663514.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T11-22-11.663514.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- '**/details_harness|winogrande|5_2024-01-15T11-22-11.663514.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-15T11-22-11.663514.parquet'
- config_name: results
data_files:
- split: 2024_01_15T11_22_11.663514
path:
- results_2024-01-15T11-22-11.663514.parquet
- split: latest
path:
- results_2024-01-15T11-22-11.663514.parquet
---
# Dataset Card for Evaluation run of alnrg2arg/test2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alnrg2arg/test2](https://huggingface.co/alnrg2arg/test2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__test2",
"harness_winogrande_5",
split="latest")
```
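To inspect the aggregated metrics instead of a single task, load the `results` config the same way. A minimal sketch (the exact column layout of the results file is not documented on this card, so inspect it before indexing into it):
```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split points at the newest results file.
results = load_dataset(
    "open-llm-leaderboard/details_alnrg2arg__test2",
    "results",
    split="latest",
)
print(results)  # check the available columns before indexing into them
```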
## Latest results
These are the [latest results from run 2024-01-15T11:22:11.663514](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test2/blob/main/results_2024-01-15T11-22-11.663514.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its results file and in the "latest" split for each eval):
```json
{
"all": {
"acc": 0.24636436924076846,
"acc_stderr": 0.03057531615216942,
"acc_norm": 0.24707158644894944,
"acc_norm_stderr": 0.031385477138922584,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662571,
"mc2": 0.5013831681930769,
"mc2_stderr": 0.017248638043307455
},
"harness|arc:challenge|25": {
"acc": 0.23293515358361774,
"acc_stderr": 0.012352507042617408,
"acc_norm": 0.2721843003412969,
"acc_norm_stderr": 0.013006600406423706
},
"harness|hellaswag|10": {
"acc": 0.2539334793865764,
"acc_stderr": 0.0043437045123801,
"acc_norm": 0.26249751045608444,
"acc_norm_stderr": 0.004390923353200561
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.03455473702325437,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03455473702325437
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.03391160934343604,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.03391160934343604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.0336876293225943,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.0336876293225943
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2723404255319149,
"acc_stderr": 0.029101290698386708,
"acc_norm": 0.2723404255319149,
"acc_norm_stderr": 0.029101290698386708
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003336,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.021851509822031715,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.021851509822031715
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.024472243840895518,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.024472243840895518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.03090379695211449,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.03090379695211449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.29292929292929293,
"acc_stderr": 0.03242497958178817,
"acc_norm": 0.29292929292929293,
"acc_norm_stderr": 0.03242497958178817
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2694300518134715,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.2694300518134715,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02213908110397153,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02213908110397153
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.0260671592222758,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.0260671592222758
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380558,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380558
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22935779816513763,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.22935779816513763,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24537037037037038,
"acc_stderr": 0.02934666509437295,
"acc_norm": 0.24537037037037038,
"acc_norm_stderr": 0.02934666509437295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460295,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460295
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2242152466367713,
"acc_stderr": 0.027991534258519527,
"acc_norm": 0.2242152466367713,
"acc_norm_stderr": 0.027991534258519527
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052191,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052191
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467764,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467764
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.027046857630716667,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.027046857630716667
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24648786717752236,
"acc_stderr": 0.015411308769686938,
"acc_norm": 0.24648786717752236,
"acc_norm_stderr": 0.015411308769686938
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0230836585869842,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0230836585869842
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210746,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351277,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351277
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.02399350170904212,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.02399350170904212
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843003,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843003
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24967405475880053,
"acc_stderr": 0.011054538377832322,
"acc_norm": 0.24967405475880053,
"acc_norm_stderr": 0.011054538377832322
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21323529411764705,
"acc_stderr": 0.02488097151229426,
"acc_norm": 0.21323529411764705,
"acc_norm_stderr": 0.02488097151229426
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.01759348689536683,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.01759348689536683
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22040816326530613,
"acc_stderr": 0.026537045312145284,
"acc_norm": 0.22040816326530613,
"acc_norm_stderr": 0.026537045312145284
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.033844291552331346,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.033844291552331346
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662571,
"mc2": 0.5013831681930769,
"mc2_stderr": 0.017248638043307455
},
"harness|winogrande|5": {
"acc": 0.4988161010260458,
"acc_stderr": 0.014052446290529022
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
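The leaderboard's MMLU number is an aggregate over the `hendrycksTest-*` tasks above. As a rough illustration (a sketch under stated assumptions, not the leaderboard's exact aggregation code), the per-task `acc_norm` values can be averaged like this:
```python
import statistics

# A hypothetical subset of the per-task results shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.17},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.2},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.2236842105263158},
}

mmlu_scores = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
print(f"MMLU (mean acc_norm over {len(mmlu_scores)} tasks): "
      f"{statistics.mean(mmlu_scores):.4f}")
```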
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bot-yaya/un_pdf_segmented9973 | ---
dataset_info:
features:
- name: record
dtype: string
- name: is_hard_linebreak
sequence: bool
- name: joined_text
dtype: string
splits:
- name: train
num_bytes: 436975487
num_examples: 9973
download_size: 203807245
dataset_size: 436975487
---
# Dataset Card for "un_pdf_segmented9973"
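The card body is otherwise a stub, but the features in the YAML above suggest each row pairs a raw `record` with a per-linebreak `is_hard_linebreak` mask and the resulting `joined_text`. A minimal sketch of how such a mask could be applied (the actual segmentation logic is an assumption; it is not documented here):
```python
def join_lines(record: str, is_hard_linebreak: list[bool]) -> str:
    """Rejoin lines, keeping only the linebreaks flagged as 'hard'."""
    lines = record.split("\n")
    out = [lines[0]]
    for is_hard, line in zip(is_hard_linebreak, lines[1:]):
        out.append("\n" + line if is_hard else " " + line)
    return "".join(out)

print(join_lines("a wrapped\nline\nNew paragraph", [False, True]))
# -> "a wrapped line\nNew paragraph"
```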
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thanmay/arc-easy-ta | ---
dataset_info:
features:
- name: id
dtype: string
- name: answerKey
dtype: string
- name: itv2 ta
dtype: string
- name: question
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
splits:
- name: test
num_bytes: 3289750
num_examples: 2376
- name: validation
num_bytes: 787255
num_examples: 570
download_size: 1379065
dataset_size: 4077005
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
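No card body was provided, but the features above are enough to load and display an item (a minimal sketch, assuming the schema listed in the YAML):
```python
from datasets import load_dataset

ds = load_dataset("Thanmay/arc-easy-ta", split="validation")

ex = ds[0]
print(ex["question"])  # the "itv2 ta" field holds the Tamil translation
for label, text in zip(ex["choices"]["label"], ex["choices"]["text"]):
    marker = "*" if label == ex["answerKey"] else " "
    print(f"{marker} {label}. {text}")
```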
|
CrabfishAI/InstructQA-Highquality-16k | ---
language:
- en
size_categories:
- 10K<n<100K
license: unknown
task_categories:
- question-answering
---
# Dataset Card for Dataset Name
This dataset is a culmination of diverse sources, carefully curated to form a versatile and comprehensive resource. We have amalgamated high-quality text from several existing instruction datasets into this unified dataset, designed to serve a wide range of purposes.
### Datasets used to create this dataset
- [aditijha/instruct_v1_10k](https://huggingface.co/datasets/aditijha/instruct_v1_10k)
- [mosaicml/instruct-v3](https://huggingface.co/datasets/mosaicml/instruct-v3)
- [jondurbin/airoboros-2.2.1](https://huggingface.co/datasets/jondurbin/airoboros-2.2.1)
## Uses
This dataset can be used to fine-tune instruction-following models.
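Since the card does not document the schema or split names, a sensible first step is to load the dataset and inspect it before wiring up a fine-tuning pipeline (a minimal sketch; nothing below is stated on the card itself):
```python
from datasets import load_dataset

ds = load_dataset("CrabfishAI/InstructQA-Highquality-16k")
split_name, split = next(iter(ds.items()))   # split names are not documented on this card
print(split_name, split.column_names)        # inspect the schema before fine-tuning
print(split[0])                              # look at one example
```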