| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
4eJIoBek/Old-GIFs-22k | 2023-10-01T07:55:46.000Z | [
"license:unknown",
"region:us"
] | 4eJIoBek | null | null | null | 0 | 0 | ---
license: unknown
---
|
crumb/syntax-laion-32k | 2023-10-01T07:59:14.000Z | [
"region:us"
] | crumb | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: url
dtype: string
- name: caption
dtype: string
- name: original_caption
dtype: string
splits:
- name: train
num_bytes: 10311225
num_examples: 32000
download_size: 6993643
dataset_size: 10311225
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "syntax-laion-32k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/ToxicQAtextFiltered | 2023-10-01T08:02:56.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | ---
license: mit
---
This is the text-filtered version of ToxicQA, with the semi-refusals (e.g., "Remember, killing is bad") filtered out.
This is a work in progress. |
ChanceFocus/flare-zh-afqmc | 2023-10-01T08:12:00.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 0 | 0 | Entry not found |
ChanceFocus/flare-zh-corpus | 2023-10-01T08:13:38.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 1 | 0 | Entry not found |
ChanceFocus/flare-zh-stocka | 2023-10-01T08:14:10.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 0 | 0 | Entry not found |
ChanceFocus/flare-zh-fineval | 2023-10-01T08:15:02.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 0 | 0 | Entry not found |
ChanceFocus/flare-zh-fe | 2023-10-01T08:15:26.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 0 | 0 | Entry not found |
ChanceFocus/flare-zh-nl | 2023-10-01T08:15:48.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 0 | 0 | Entry not found |
ChanceFocus/flare-zh-nl2 | 2023-10-01T08:16:13.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 0 | 0 | Entry not found |
ChanceFocus/flare-zh-nsp | 2023-10-01T08:17:16.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 0 | 0 | Entry not found |
ChanceFocus/flare-zh-re | 2023-10-01T08:17:38.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 0 | 0 | Entry not found |
ChanceFocus/flare-zh-stockb | 2023-10-01T08:18:09.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 0 | 0 | Entry not found |
tazarov/test1 | 2023-10-01T08:22:16.000Z | [
"region:us"
] | tazarov | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: embedding
sequence: float32
- name: document
dtype: string
- name: metadata._id
dtype: string
- name: metadata.title
dtype: string
splits:
- name: train
num_bytes: 660267
num_examples: 100
download_size: 947796
dataset_size: 660267
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pphuc25/vlsp2023-test3 | 2023-10-01T08:58:53.000Z | [
"region:us"
] | pphuc25 | null | null | null | 0 | 0 | Entry not found |
nikchar/retrieval_verification_bm25_distilbert | 2023-10-01T08:57:25.000Z | [
"region:us"
] | nikchar | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: claim
dtype: string
- name: evidence_wiki_url
dtype: string
- name: text
dtype: string
- name: retrieved_evidence_title
sequence: string
- name: retrieved_evidence_text
sequence: string
- name: labels
dtype: int64
- name: Retrieval_Success
dtype: bool
- name: Predicted_Labels
dtype: int64
- name: Predicted_Labels_Each_doc
sequence: int64
splits:
- name: train
num_bytes: 66031496
num_examples: 11073
download_size: 30811947
dataset_size: 66031496
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "retrieval_verification_bm25_distilbert"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davidazaraf/images | 2023-10-01T09:02:50.000Z | [
"region:us"
] | davidazaraf | null | null | null | 0 | 0 | Entry not found |
tazarov/ds2 | 2023-10-01T09:12:13.000Z | [
"region:us"
] | tazarov | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: embedding
sequence: float32
- name: document
dtype: string
- name: metadata._id
dtype: string
- name: metadata.title
dtype: string
splits:
- name: train
num_bytes: 660267
num_examples: 100
download_size: 947796
dataset_size: 660267
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ds2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/guiltycrown | 2023-10-01T10:45:18.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Guilty Crown
This is the image base of the bangumi Guilty Crown. We detected 30 characters and 2278 images in total. The full dataset is available [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise.** If you intend to train models on this dataset, we recommend preprocessing the downloaded data to eliminate potentially noisy samples (roughly 1% of images). A download sketch follows the preview table below.
Here is a preview of the characters:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 497 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 38 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 25 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 132 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 94 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 47 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 65 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 15 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 19 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 24 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 61 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 55 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 18 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 106 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 88 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 103 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 38 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 34 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 26 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 22 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 73 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 61 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 84 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 16 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 52 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 8 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 31 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 6 | [Download](27/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 28 | 198 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 242 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
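As a hedged sketch of fetching a single character's archive from this repo (file paths taken from the table above), `huggingface_hub` can download individual files:
```python
from huggingface_hub import hf_hub_download

# Download character #0's image archive (497 images per the table above).
path = hf_hub_download(
    repo_id="BangumiBase/guiltycrown",
    filename="0/dataset.zip",
    repo_type="dataset",
)
print(path)  # local cache path; extract and manually clean as advised above
```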
|
BangumiBase/seraphoftheend | 2023-10-01T11:14:14.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Seraph Of The End
This is the image base of the bangumi Seraph of the End. We detected 51 characters and 3456 images in total. The full dataset is available [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise.** If you intend to train models on this dataset, we recommend preprocessing the downloaded data to eliminate potentially noisy samples (roughly 1% of images).
Here is a preview of the characters:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 238 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 32 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 191 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 106 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 152 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 41 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 35 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 41 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 75 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 14 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 13 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 16 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 36 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 702 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 24 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 173 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 61 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 20 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 12 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 10 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 227 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 90 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 67 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 28 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 64 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 12 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 18 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 353 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 27 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 21 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 14 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 94 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 8 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 13 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 7 | [Download](34/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 35 | 15 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 7 | [Download](36/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 37 | 17 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 8 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 6 | [Download](39/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 40 | 45 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 10 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 31 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 13 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 28 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 36 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 16 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 6 | [Download](47/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 48 | 17 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 20 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 146 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
4eJIoBek/Old-audios-11k | 2023-10-01T09:34:30.000Z | [
"license:unknown",
"region:us"
] | 4eJIoBek | null | null | null | 0 | 0 | ---
license: unknown
---
Unsorted audio files in MOD, WAV, or other old audio formats. |
ichiro0128/kuroka | 2023-10-01T09:32:35.000Z | [
"region:us"
] | ichiro0128 | null | null | null | 0 | 0 | Entry not found |
vlsp-2023-vllm/exams | 2023-10-01T09:53:57.000Z | [
"region:us"
] | vlsp-2023-vllm | null | null | null | 0 | 0 | Entry not found |
NaNames/ljs_mode | 2023-10-01T09:46:53.000Z | [
"license:apache-2.0",
"region:us"
] | NaNames | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
nikchar/retrieval_verification_squeezebert | 2023-10-01T10:01:18.000Z | [
"region:us"
] | nikchar | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: claim
dtype: string
- name: evidence_wiki_url
dtype: string
- name: text
dtype: string
- name: retrieved_evidence_title
sequence: string
- name: retrieved_evidence_text
sequence: string
- name: labels
dtype: int64
- name: Retrieval_Success
dtype: bool
- name: Predicted_Labels
dtype: int64
- name: Predicted_Labels_Each_doc
sequence: int64
splits:
- name: train
num_bytes: 73601741
num_examples: 11073
download_size: 34426520
dataset_size: 73601741
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "retrieval_verification_squeezebert"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nikchar/retrieval_verification_bm25_squeezebert | 2023-10-01T10:07:11.000Z | [
"region:us"
] | nikchar | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: claim
dtype: string
- name: evidence_wiki_url
dtype: string
- name: text
dtype: string
- name: retrieved_evidence_title
sequence: string
- name: retrieved_evidence_text
sequence: string
- name: labels
dtype: int64
- name: Retrieval_Success
dtype: bool
- name: Predicted_Labels
dtype: int64
- name: Predicted_Labels_Each_doc
sequence: int64
splits:
- name: train
num_bytes: 66031496
num_examples: 11073
download_size: 30811993
dataset_size: 66031496
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "retrieval_verification_bm25_squeezebert"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anonymousmaharaj/dtf_comments_dataset | 2023-10-01T14:01:33.000Z | [
"size_categories:1M<n<10M",
"language:ru",
"license:apache-2.0",
"russian",
"dtf",
"comments",
"ru",
"region:us"
] | anonymousmaharaj | null | null | null | 0 | 0 | ---
license: apache-2.0
language:
- ru
tags:
- russian
- dtf
- comments
- ru
pretty_name: DTF Comments Dataset
size_categories:
- 1M<n<10M
---
# Good news everyone!
Here is a dataset with data from **dtf.ru**.
Collected ~4.6 million comments from ~500k users.
The last comment is dated 18 September 2023.
My post about this dataset - [Link](https://dtf.ru/u/169798-infernalnyy-gavnoed/2157548-ya-proanaliziroval-4-5-milliona-kommentariev-s-dtf-chtoby-tebe-ne-prishlos-etogo-delat-rezultat-ubil)
## Enjoy! |
tazarov/large-ds2 | 2023-10-01T10:15:25.000Z | [
"region:us"
] | tazarov | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: embedding
sequence: float32
- name: document
dtype: string
- name: metadata._id
dtype: string
- name: metadata.title
dtype: string
splits:
- name: train
num_bytes: 66035524
num_examples: 10000
download_size: 70392827
dataset_size: 66035524
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "large-ds2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zakay/kuririn | 2023-10-01T10:34:38.000Z | [
"license:openrail",
"region:us"
] | Zakay | null | null | null | 0 | 0 | ---
license: openrail
---
|
shawarmas/differentmesfianwords | 2023-10-04T10:28:23.000Z | [
"region:us"
] | shawarmas | null | null | null | 0 | 0 | Entry not found |
tazarov/dst123 | 2023-10-01T12:30:29.000Z | [
"region:us"
] | tazarov | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: embedding
sequence: float32
- name: document
dtype: string
- name: metadata._id
dtype: string
- name: metadata.title
dtype: string
splits:
- name: train
num_bytes: 132062224
num_examples: 20000
download_size: 111333452
dataset_size: 132062224
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dst123"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
erdaljericho/styles | 2023-10-01T11:27:49.000Z | [
"region:us"
] | erdaljericho | null | null | null | 0 | 0 | Entry not found |
qwtreue5u5trhdfgh/IllusionDiffusionGC | 2023-10-01T11:29:42.000Z | [
"license:mit",
"region:us"
] | qwtreue5u5trhdfgh | null | null | null | 0 | 0 | ---
license: mit
---
|
yeombora/sample | 2023-10-01T11:54:53.000Z | [
"license:mit",
"region:us"
] | yeombora | null | null | null | 0 | 0 | ---
license: mit
---
|
SilverSurge/trial_1 | 2023-10-01T11:55:26.000Z | [
"region:us"
] | SilverSurge | null | null | null | 0 | 0 | Entry not found |
Thouph/post_snapshot | 2023-10-01T12:14:09.000Z | [
"license:mit",
"region:us"
] | Thouph | null | null | null | 0 | 0 | ---
license: mit
---
|
mickylan2367/ZipfilePractice | 2023-10-10T07:47:09.000Z | [
"size_categories:1K<n<10K",
"language:en",
"license:cc-by-sa-4.0",
"music",
"spectrogram",
"text",
"text2music",
"region:us"
] | mickylan2367 | null | null | null | 1 | 0 | ---
license: cc-by-sa-4.0
language:
- en
tags:
- music
- spectrogram
- text
- text2music
size_categories:
- 1K<n<10K
---
# A dataset of spectrograms generated from Google/MusicCaps audio
* The content is the same as <a href="https://huggingface.co/datasets/mickylan2367/GraySpectrogram">mickylan2367/GraySpectrogram</a>.
* However, because this dataset packages the data itself into zip files, it downloads (slightly) faster than GraySpectrogram.
## Basic information
* sampling_rate: int = 44100
* 20-second WAV files are converted into 1600×800 PNG files
* Following librosa's conventions, the vertical axis of each image is frequency (0-10000?Hz) and the horizontal axis is time (0-40 seconds)
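A hedged sketch (not necessarily the exact pipeline used here) of turning a 20-second, 44100 Hz WAV file into a grayscale 1600×800 spectrogram PNG with librosa:
```python
import librosa
import numpy as np
from PIL import Image

# Load 20 seconds of audio at the stated sampling rate (the file name is hypothetical).
y, sr = librosa.load("clip.wav", sr=44100, duration=20.0)

# Magnitude spectrogram in dB, rescaled to 8-bit grayscale.
S = librosa.amplitude_to_db(np.abs(librosa.stft(y)), ref=np.max)
img = ((S - S.min()) / (S.max() - S.min()) * 255).astype(np.uint8)

# Flip so low frequencies sit at the bottom, then resize to the 1600x800 layout above.
Image.fromarray(img[::-1]).resize((1600, 800)).save("clip.png")
```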
|
tazarov/dst1234 | 2023-10-01T13:59:44.000Z | [
"language:en",
"license:mit",
"region:us"
] | tazarov | null | null | null | 0 | 0 | ---
language:
- en
license: mit
dataset_info:
features:
- name: id
dtype: string
- name: embedding
sequence: float32
- name: document
dtype: string
- name: metadata._id
dtype: string
- name: metadata.title
dtype: string
splits:
- name: train
num_bytes: 1318281
num_examples: 200
download_size: 0
dataset_size: 1318281
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
x-chroma:
collection: name
metadata:
test: 1
---
# Dataset Card for "dst1234"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ichiro0128/seisokuone | 2023-10-01T12:49:58.000Z | [
"region:us"
] | ichiro0128 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_pds-eval | 2023-10-01T13:08:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of DevaMalla/llama_7b_qlora_pds-eval
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DevaMalla/llama_7b_qlora_pds-eval](https://huggingface.co/DevaMalla/llama_7b_qlora_pds-eval)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_pds-eval\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T13:07:32.777703](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_pds-eval/blob/main/results_2023-10-01T13-07-32.777703.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.33715617388535024,\n\
\ \"acc_stderr\": 0.03392083787082158,\n \"acc_norm\": 0.3409781847243754,\n\
\ \"acc_norm_stderr\": 0.03390672995206057,\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4560380785373756,\n\
\ \"mc2_stderr\": 0.014670930905323707\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5085324232081911,\n \"acc_stderr\": 0.014609263165632186,\n\
\ \"acc_norm\": 0.5392491467576792,\n \"acc_norm_stderr\": 0.014566303676636581\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5865365465046803,\n\
\ \"acc_stderr\": 0.00491448053453371,\n \"acc_norm\": 0.7813184624576778,\n\
\ \"acc_norm_stderr\": 0.004125072816630342\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3886792452830189,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.3886792452830189,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.0398124054371786,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.0398124054371786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.03345036916788991,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.03345036916788991\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307811,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307811\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.037528339580033376,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.037528339580033376\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790605,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3032258064516129,\n\
\ \"acc_stderr\": 0.02614868593067174,\n \"acc_norm\": 0.3032258064516129,\n\
\ \"acc_norm_stderr\": 0.02614868593067174\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.031089826002937523,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.031089826002937523\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3939393939393939,\n \"acc_stderr\": 0.038154943086889305,\n\
\ \"acc_norm\": 0.3939393939393939,\n \"acc_norm_stderr\": 0.038154943086889305\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2878787878787879,\n \"acc_stderr\": 0.03225883512300993,\n \"\
acc_norm\": 0.2878787878787879,\n \"acc_norm_stderr\": 0.03225883512300993\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35233160621761656,\n \"acc_stderr\": 0.034474782864143565,\n\
\ \"acc_norm\": 0.35233160621761656,\n \"acc_norm_stderr\": 0.034474782864143565\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.023119362758232277,\n\
\ \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.023119362758232277\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.030283995525884396,\n\
\ \"acc_norm\": 0.31932773109243695,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3834862385321101,\n \"acc_stderr\": 0.020847156641915984,\n \"\
acc_norm\": 0.3834862385321101,\n \"acc_norm_stderr\": 0.020847156641915984\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.19444444444444445,\n \"acc_stderr\": 0.026991454502036733,\n \"\
acc_norm\": 0.19444444444444445,\n \"acc_norm_stderr\": 0.026991454502036733\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.39705882352941174,\n \"acc_stderr\": 0.03434131164719129,\n \"\
acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.03434131164719129\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3628691983122363,\n \"acc_stderr\": 0.03129920825530213,\n \
\ \"acc_norm\": 0.3628691983122363,\n \"acc_norm_stderr\": 0.03129920825530213\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4080717488789238,\n\
\ \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.4080717488789238,\n\
\ \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3053435114503817,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.3053435114503817,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3611111111111111,\n\
\ \"acc_stderr\": 0.04643454608906274,\n \"acc_norm\": 0.3611111111111111,\n\
\ \"acc_norm_stderr\": 0.04643454608906274\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.34355828220858897,\n \"acc_stderr\": 0.03731133519673892,\n\
\ \"acc_norm\": 0.34355828220858897,\n \"acc_norm_stderr\": 0.03731133519673892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.04620284082280039,\n\
\ \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.04620284082280039\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.03255326307272487,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.03255326307272487\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.39846743295019155,\n\
\ \"acc_stderr\": 0.0175074386027774,\n \"acc_norm\": 0.39846743295019155,\n\
\ \"acc_norm_stderr\": 0.0175074386027774\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3699421965317919,\n \"acc_stderr\": 0.02599247202930639,\n\
\ \"acc_norm\": 0.3699421965317919,\n \"acc_norm_stderr\": 0.02599247202930639\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3104575163398693,\n \"acc_stderr\": 0.026493033225145894,\n\
\ \"acc_norm\": 0.3104575163398693,\n \"acc_norm_stderr\": 0.026493033225145894\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n\
\ \"acc_stderr\": 0.025755865922632938,\n \"acc_norm\": 0.28938906752411575,\n\
\ \"acc_norm_stderr\": 0.025755865922632938\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.026869490744815247,\n\
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.026869490744815247\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503807,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503807\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28748370273794005,\n\
\ \"acc_stderr\": 0.011559337355708514,\n \"acc_norm\": 0.28748370273794005,\n\
\ \"acc_norm_stderr\": 0.011559337355708514\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33088235294117646,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.33088235294117646,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.35294117647058826,\n \"acc_stderr\": 0.01933314202079707,\n \
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.01933314202079707\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.42727272727272725,\n\
\ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.42727272727272725,\n\
\ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2653061224489796,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.2653061224489796,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3482587064676617,\n\
\ \"acc_stderr\": 0.03368787466115459,\n \"acc_norm\": 0.3482587064676617,\n\
\ \"acc_norm_stderr\": 0.03368787466115459\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n\
\ \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.35542168674698793,\n\
\ \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03811079669833531,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03811079669833531\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4560380785373756,\n\
\ \"mc2_stderr\": 0.014670930905323707\n }\n}\n```"
repo_url: https://huggingface.co/DevaMalla/llama_7b_qlora_pds-eval
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-07-32.777703.parquet'
- config_name: results
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- results_2023-10-01T13-07-32.777703.parquet
- split: latest
path:
- results_2023-10-01T13-07-32.777703.parquet
---
# Dataset Card for Evaluation run of DevaMalla/llama_7b_qlora_pds-eval
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DevaMalla/llama_7b_qlora_pds-eval
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DevaMalla/llama_7b_qlora_pds-eval](https://huggingface.co/DevaMalla/llama_7b_qlora_pds-eval) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_pds-eval",
"harness_truthfulqa_mc_0",
split="train")
```
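For example, to inspect the aggregated metrics rather than a single task's details, you can load the `results` configuration with its `latest` split (a minimal sketch; the configuration and split names come from the YAML header above):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_pds-eval",
    "results",
    split="latest",
)
```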
## Latest results
These are the [latest results from run 2023-10-01T13:07:32.777703](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_pds-eval/blob/main/results_2023-10-01T13-07-32.777703.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.33715617388535024,
"acc_stderr": 0.03392083787082158,
"acc_norm": 0.3409781847243754,
"acc_norm_stderr": 0.03390672995206057,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.4560380785373756,
"mc2_stderr": 0.014670930905323707
},
"harness|arc:challenge|25": {
"acc": 0.5085324232081911,
"acc_stderr": 0.014609263165632186,
"acc_norm": 0.5392491467576792,
"acc_norm_stderr": 0.014566303676636581
},
"harness|hellaswag|10": {
"acc": 0.5865365465046803,
"acc_stderr": 0.00491448053453371,
"acc_norm": 0.7813184624576778,
"acc_norm_stderr": 0.004125072816630342
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3886792452830189,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.3886792452830189,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.0398124054371786,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.0398124054371786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788991,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788991
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307811,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307811
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.037528339580033376,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.037528339580033376
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790605,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3032258064516129,
"acc_stderr": 0.02614868593067174,
"acc_norm": 0.3032258064516129,
"acc_norm_stderr": 0.02614868593067174
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.031089826002937523,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.031089826002937523
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3939393939393939,
"acc_stderr": 0.038154943086889305,
"acc_norm": 0.3939393939393939,
"acc_norm_stderr": 0.038154943086889305
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2878787878787879,
"acc_stderr": 0.03225883512300993,
"acc_norm": 0.2878787878787879,
"acc_norm_stderr": 0.03225883512300993
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35233160621761656,
"acc_stderr": 0.034474782864143565,
"acc_norm": 0.35233160621761656,
"acc_norm_stderr": 0.034474782864143565
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.023119362758232277,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.023119362758232277
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31932773109243695,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.31932773109243695,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3834862385321101,
"acc_stderr": 0.020847156641915984,
"acc_norm": 0.3834862385321101,
"acc_norm_stderr": 0.020847156641915984
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.026991454502036733,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.026991454502036733
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.03434131164719129,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.03434131164719129
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3628691983122363,
"acc_stderr": 0.03129920825530213,
"acc_norm": 0.3628691983122363,
"acc_norm_stderr": 0.03129920825530213
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4080717488789238,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.4080717488789238,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3053435114503817,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.3053435114503817,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.04643454608906274,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.04643454608906274
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.34355828220858897,
"acc_stderr": 0.03731133519673892,
"acc_norm": 0.34355828220858897,
"acc_norm_stderr": 0.03731133519673892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.32038834951456313,
"acc_stderr": 0.04620284082280039,
"acc_norm": 0.32038834951456313,
"acc_norm_stderr": 0.04620284082280039
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03255326307272487,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03255326307272487
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.39846743295019155,
"acc_stderr": 0.0175074386027774,
"acc_norm": 0.39846743295019155,
"acc_norm_stderr": 0.0175074386027774
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.02599247202930639,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.02599247202930639
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3104575163398693,
"acc_stderr": 0.026493033225145894,
"acc_norm": 0.3104575163398693,
"acc_norm_stderr": 0.026493033225145894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.025755865922632938,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.025755865922632938
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.026869490744815247,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.026869490744815247
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503807,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503807
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28748370273794005,
"acc_stderr": 0.011559337355708514,
"acc_norm": 0.28748370273794005,
"acc_norm_stderr": 0.011559337355708514
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33088235294117646,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.33088235294117646,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.01933314202079707,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.01933314202079707
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.42727272727272725,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.42727272727272725,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2653061224489796,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.2653061224489796,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3482587064676617,
"acc_stderr": 0.03368787466115459,
"acc_norm": 0.3482587064676617,
"acc_norm_stderr": 0.03368787466115459
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03811079669833531,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03811079669833531
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.4560380785373756,
"mc2_stderr": 0.014670930905323707
}
}
```
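If you prefer working with the raw JSON file linked above rather than the `datasets` API, a minimal sketch using `huggingface_hub` (assuming the results filename listed in the `results` configuration of the YAML header) is:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file from the dataset repository and parse it.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_pds-eval",
    filename="results_2023-10-01T13-07-32.777703.json",
    repo_type="dataset",
)
with open(path) as f:
    raw_results = json.load(f)

# Inspect the top-level structure before digging into individual metrics.
print(list(raw_results.keys()))
```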
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-2.2.1 | 2023-10-01T13:14:37.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-l2-7b-2.2.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-l2-7b-2.2.1](https://huggingface.co/jondurbin/airoboros-l2-7b-2.2.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-2.2.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T13:13:15.281257](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-2.2.1/blob/main/results_2023-10-01T13-13-15.281257.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47934843439824193,\n\
\ \"acc_stderr\": 0.03515972851914906,\n \"acc_norm\": 0.4831786165059527,\n\
\ \"acc_norm_stderr\": 0.03514364977856416,\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.016095884155386847,\n \"mc2\": 0.44652308723072004,\n\
\ \"mc2_stderr\": 0.014483627563775884\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.515358361774744,\n \"acc_stderr\": 0.014604496129394913,\n\
\ \"acc_norm\": 0.5503412969283277,\n \"acc_norm_stderr\": 0.014537144444284738\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6096395140410277,\n\
\ \"acc_stderr\": 0.004868341056566223,\n \"acc_norm\": 0.800637323242382,\n\
\ \"acc_norm_stderr\": 0.003987047047167317\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4981132075471698,\n \"acc_stderr\": 0.030772653642075664,\n\
\ \"acc_norm\": 0.4981132075471698,\n \"acc_norm_stderr\": 0.030772653642075664\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.4046242774566474,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31216931216931215,\n \"acc_stderr\": 0.0238652068369726,\n \"\
acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.0238652068369726\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5258064516129032,\n \"acc_stderr\": 0.028406095057653326,\n \"\
acc_norm\": 0.5258064516129032,\n \"acc_norm_stderr\": 0.028406095057653326\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561953,\n \"\
acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561953\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.0381549430868893,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.0381549430868893\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5959595959595959,\n \"acc_stderr\": 0.03496130972056128,\n \"\
acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.03496130972056128\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6632124352331606,\n \"acc_stderr\": 0.03410780251836184,\n\
\ \"acc_norm\": 0.6632124352331606,\n \"acc_norm_stderr\": 0.03410780251836184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.025275892070240634,\n\
\ \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.025275892070240634\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.453781512605042,\n \"acc_stderr\": 0.032339434681820885,\n \
\ \"acc_norm\": 0.453781512605042,\n \"acc_norm_stderr\": 0.032339434681820885\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6385321100917432,\n \"acc_stderr\": 0.02059808200993737,\n \"\
acc_norm\": 0.6385321100917432,\n \"acc_norm_stderr\": 0.02059808200993737\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6029411764705882,\n \"acc_stderr\": 0.0343413116471913,\n \"acc_norm\"\
: 0.6029411764705882,\n \"acc_norm_stderr\": 0.0343413116471913\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955934,\n \"\
acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5739910313901345,\n\
\ \"acc_stderr\": 0.033188332862172806,\n \"acc_norm\": 0.5739910313901345,\n\
\ \"acc_norm_stderr\": 0.033188332862172806\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\"\
: 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190192,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.03915857291436971,\n\
\ \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.03915857291436971\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n\
\ \"acc_stderr\": 0.030236389942173085,\n \"acc_norm\": 0.6923076923076923,\n\
\ \"acc_norm_stderr\": 0.030236389942173085\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6577266922094508,\n\
\ \"acc_stderr\": 0.01696703176641362,\n \"acc_norm\": 0.6577266922094508,\n\
\ \"acc_norm_stderr\": 0.01696703176641362\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.026817718130348923,\n\
\ \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.026817718130348923\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5032679738562091,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49691358024691357,\n \"acc_stderr\": 0.027820214158594384,\n\
\ \"acc_norm\": 0.49691358024691357,\n \"acc_norm_stderr\": 0.027820214158594384\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3559322033898305,\n\
\ \"acc_stderr\": 0.01222864553727757,\n \"acc_norm\": 0.3559322033898305,\n\
\ \"acc_norm_stderr\": 0.01222864553727757\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4722222222222222,\n \"acc_stderr\": 0.020196594933541197,\n \
\ \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.020196594933541197\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5469387755102041,\n \"acc_stderr\": 0.03186785930004128,\n\
\ \"acc_norm\": 0.5469387755102041,\n \"acc_norm_stderr\": 0.03186785930004128\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.016095884155386847,\n \"mc2\": 0.44652308723072004,\n\
\ \"mc2_stderr\": 0.014483627563775884\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-l2-7b-2.2.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-13-15.281257.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-13-15.281257.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-13-15.281257.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-13-15.281257.parquet'
- config_name: results
data_files:
- split: 2023_10_01T13_13_15.281257
path:
- results_2023-10-01T13-13-15.281257.parquet
- split: latest
path:
- results_2023-10-01T13-13-15.281257.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-7b-2.2.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-7b-2.2.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-7b-2.2.1](https://huggingface.co/jondurbin/airoboros-l2-7b-2.2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-2.2.1",
"harness_truthfulqa_mc_0",
split="train")
```
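If you want the aggregated metrics rather than the per-task details, a minimal sketch (assuming the repository layout described above) is to load the `results` configuration; the `latest` split always points to the most recent results file:
```python
from datasets import load_dataset

# Load the aggregated results of the run; "latest" points to the
# most recent results_*.parquet file in the repository.
results = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-2.2.1",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics such as acc, acc_norm, mc1, mc2
```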
## Latest results
These are the [latest results from run 2023-10-01T13:13:15.281257](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-2.2.1/blob/main/results_2023-10-01T13-13-15.281257.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.47934843439824193,
"acc_stderr": 0.03515972851914906,
"acc_norm": 0.4831786165059527,
"acc_norm_stderr": 0.03514364977856416,
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386847,
"mc2": 0.44652308723072004,
"mc2_stderr": 0.014483627563775884
},
"harness|arc:challenge|25": {
"acc": 0.515358361774744,
"acc_stderr": 0.014604496129394913,
"acc_norm": 0.5503412969283277,
"acc_norm_stderr": 0.014537144444284738
},
"harness|hellaswag|10": {
"acc": 0.6096395140410277,
"acc_stderr": 0.004868341056566223,
"acc_norm": 0.800637323242382,
"acc_norm_stderr": 0.003987047047167317
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.45394736842105265,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.45394736842105265,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4981132075471698,
"acc_stderr": 0.030772653642075664,
"acc_norm": 0.4981132075471698,
"acc_norm_stderr": 0.030772653642075664
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.0238652068369726,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.0238652068369726
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5258064516129032,
"acc_stderr": 0.028406095057653326,
"acc_norm": 0.5258064516129032,
"acc_norm_stderr": 0.028406095057653326
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561953,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561953
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.0381549430868893,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.0381549430868893
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5959595959595959,
"acc_stderr": 0.03496130972056128,
"acc_norm": 0.5959595959595959,
"acc_norm_stderr": 0.03496130972056128
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6632124352331606,
"acc_stderr": 0.03410780251836184,
"acc_norm": 0.6632124352331606,
"acc_norm_stderr": 0.03410780251836184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.025275892070240634,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.025275892070240634
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.453781512605042,
"acc_stderr": 0.032339434681820885,
"acc_norm": 0.453781512605042,
"acc_norm_stderr": 0.032339434681820885
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6385321100917432,
"acc_stderr": 0.02059808200993737,
"acc_norm": 0.6385321100917432,
"acc_norm_stderr": 0.02059808200993737
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.0343413116471913,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.0343413116471913
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955934,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5739910313901345,
"acc_stderr": 0.033188332862172806,
"acc_norm": 0.5739910313901345,
"acc_norm_stderr": 0.033188332862172806
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190192,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5398773006134969,
"acc_stderr": 0.03915857291436971,
"acc_norm": 0.5398773006134969,
"acc_norm_stderr": 0.03915857291436971
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.030236389942173085,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.030236389942173085
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6577266922094508,
"acc_stderr": 0.01696703176641362,
"acc_norm": 0.6577266922094508,
"acc_norm_stderr": 0.01696703176641362
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.026817718130348923,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.026817718130348923
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.02795048149440127,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.02795048149440127
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49691358024691357,
"acc_stderr": 0.027820214158594384,
"acc_norm": 0.49691358024691357,
"acc_norm_stderr": 0.027820214158594384
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3559322033898305,
"acc_stderr": 0.01222864553727757,
"acc_norm": 0.3559322033898305,
"acc_norm_stderr": 0.01222864553727757
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.020196594933541197,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.020196594933541197
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5469387755102041,
"acc_stderr": 0.03186785930004128,
"acc_norm": 0.5469387755102041,
"acc_norm_stderr": 0.03186785930004128
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386847,
"mc2": 0.44652308723072004,
"mc2_stderr": 0.014483627563775884
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yagnikposhiya/CommonVoiceCorpusHindi15 | 2023-10-02T10:53:29.000Z | [
"language:hi",
"license:apache-2.0",
"region:us"
] | yagnikposhiya | null | null | null | 0 | 0 | ---
license: apache-2.0
language:
- hi
---
## CommonVoiceCorpusHindi15
#### Directory structure:
1. **assets** <br>
**a.** Download the whole compressed dataset by clicking on the cv-corpus-15.0-2023-09-08-hi.tar.gz file. <br>
**b.** splitdata.py, a Python script that splits the "clips" directory of the original dataset. HuggingFace supports 10,000 files per directory, but the "clips" directory in the original dataset contains almost 14,000 files. So the "clips" directory is split into two directories, "clips0" and "clips1": "clips0" contains exactly 10,000 audio files and "clips1" contains all the remaining audio files (see the illustrative sketch below the directory listing).
2. **data** <br>
**a.** "clips0" contains 10,000 audio files<br>
**b.** "clips1" contains 4,000+ audio files (remaining from the 14,000 audio files) <br>
**c.** All the remaining files are metadata files.
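For illustration, a minimal sketch of what such a split might look like (this is not the actual splitdata.py; the directory names and the 10,000-file limit are taken from the description above):
```python
# Illustrative sketch -- not the actual splitdata.py from the assets directory.
# Moves the first 10,000 files of "clips" into "clips0" and the rest
# into "clips1", to respect the 10,000-files-per-directory limit.
import os
import shutil

SRC = "clips"
LIMIT = 10_000

files = sorted(os.listdir(SRC))
for i, name in enumerate(files):
    dest = "clips0" if i < LIMIT else "clips1"
    os.makedirs(dest, exist_ok=True)
    shutil.move(os.path.join(SRC, name), os.path.join(dest, name))
```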
3. **Credit:** [Common Voice moz://a](https://commonvoice.mozilla.org/hi/datasets) |
open-llm-leaderboard/details_LTC-AI-Labs__Guanaco-Vicuna-7B-L2 | 2023-10-01T13:19:31.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of LTC-AI-Labs/Guanaco-Vicuna-7B-L2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LTC-AI-Labs/Guanaco-Vicuna-7B-L2](https://huggingface.co/LTC-AI-Labs/Guanaco-Vicuna-7B-L2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LTC-AI-Labs__Guanaco-Vicuna-7B-L2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T13:18:10.170951](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__Guanaco-Vicuna-7B-L2/blob/main/results_2023-10-01T13-18-10.170951.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4702719452109418,\n\
\ \"acc_stderr\": 0.03523008113672575,\n \"acc_norm\": 0.4742238812382433,\n\
\ \"acc_norm_stderr\": 0.03521548128308249,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.42748996824542757,\n\
\ \"mc2_stderr\": 0.01435868175068892\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49573378839590443,\n \"acc_stderr\": 0.014610858923956952,\n\
\ \"acc_norm\": 0.5324232081911263,\n \"acc_norm_stderr\": 0.014580637569995418\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5924118701453893,\n\
\ \"acc_stderr\": 0.0049038158859832795,\n \"acc_norm\": 0.788886675960964,\n\
\ \"acc_norm_stderr\": 0.004072645874992221\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4716981132075472,\n \"acc_stderr\": 0.0307235352490061,\n\
\ \"acc_norm\": 0.4716981132075472,\n \"acc_norm_stderr\": 0.0307235352490061\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\
\ \"acc_stderr\": 0.0413212501972337,\n \"acc_norm\": 0.4236111111111111,\n\
\ \"acc_norm_stderr\": 0.0413212501972337\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224469,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224469\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419034,\n\
\ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419034\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4870967741935484,\n\
\ \"acc_stderr\": 0.028434533152681848,\n \"acc_norm\": 0.4870967741935484,\n\
\ \"acc_norm_stderr\": 0.028434533152681848\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.032826493853041504,\n\
\ \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.032826493853041504\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.038254602783800246,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.038254602783800246\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5454545454545454,\n \"acc_stderr\": 0.03547601494006937,\n \"\
acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.03547601494006937\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.43846153846153846,\n \"acc_stderr\": 0.025158266016868564,\n\
\ \"acc_norm\": 0.43846153846153846,\n \"acc_norm_stderr\": 0.025158266016868564\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094527,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094527\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.655045871559633,\n \"acc_stderr\": 0.020380605405066955,\n \"\
acc_norm\": 0.655045871559633,\n \"acc_norm_stderr\": 0.020380605405066955\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.23148148148148148,\n \"acc_stderr\": 0.028765111718046955,\n \"\
acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.028765111718046955\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5490196078431373,\n \"acc_stderr\": 0.03492406104163613,\n \"\
acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.03492406104163613\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6540084388185654,\n \"acc_stderr\": 0.03096481058878671,\n \
\ \"acc_norm\": 0.6540084388185654,\n \"acc_norm_stderr\": 0.03096481058878671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578757,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578757\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536824,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536824\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.039265223787088445,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.039265223787088445\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.04825729337356389,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.04825729337356389\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n\
\ \"acc_stderr\": 0.029996951858349476,\n \"acc_norm\": 0.7008547008547008,\n\
\ \"acc_norm_stderr\": 0.029996951858349476\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.632183908045977,\n\
\ \"acc_stderr\": 0.01724382889184627,\n \"acc_norm\": 0.632183908045977,\n\
\ \"acc_norm_stderr\": 0.01724382889184627\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.026902900458666647,\n\
\ \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.026902900458666647\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5065359477124183,\n \"acc_stderr\": 0.028627470550556054,\n\
\ \"acc_norm\": 0.5065359477124183,\n \"acc_norm_stderr\": 0.028627470550556054\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.02764814959975147,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.02764814959975147\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4845679012345679,\n \"acc_stderr\": 0.0278074900442762,\n\
\ \"acc_norm\": 0.4845679012345679,\n \"acc_norm_stderr\": 0.0278074900442762\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281278,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281278\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35071707953063885,\n\
\ \"acc_stderr\": 0.012187773370741518,\n \"acc_norm\": 0.35071707953063885,\n\
\ \"acc_norm_stderr\": 0.012187773370741518\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46078431372549017,\n \"acc_stderr\": 0.020165523313907897,\n \
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.020165523313907897\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4489795918367347,\n \"acc_stderr\": 0.03184213866687579,\n\
\ \"acc_norm\": 0.4489795918367347,\n \"acc_norm_stderr\": 0.03184213866687579\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.03320685889744324,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.03320685889744324\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.42748996824542757,\n\
\ \"mc2_stderr\": 0.01435868175068892\n }\n}\n```"
repo_url: https://huggingface.co/LTC-AI-Labs/Guanaco-Vicuna-7B-L2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-18-10.170951.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-18-10.170951.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-18-10.170951.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-18-10.170951.parquet'
- config_name: results
data_files:
- split: 2023_10_01T13_18_10.170951
path:
- results_2023-10-01T13-18-10.170951.parquet
- split: latest
path:
- results_2023-10-01T13-18-10.170951.parquet
---
# Dataset Card for Evaluation run of LTC-AI-Labs/Guanaco-Vicuna-7B-L2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LTC-AI-Labs/Guanaco-Vicuna-7B-L2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [LTC-AI-Labs/Guanaco-Vicuna-7B-L2](https://huggingface.co/LTC-AI-Labs/Guanaco-Vicuna-7B-L2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LTC-AI-Labs__Guanaco-Vicuna-7B-L2",
"harness_truthfulqa_mc_0",
	split="latest")
```
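For instance, to enumerate the available configurations and pull the aggregated scores, a minimal sketch could look like the following (this assumes a recent `datasets` release; the exact row layout of the `results` config is not documented in this card):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_LTC-AI-Labs__Guanaco-Vicuna-7B-L2"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), "configs, e.g.:", configs[:3])

# Load the aggregated scores for the most recent run via the "latest" split.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```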
## Latest results
These are the [latest results from run 2023-10-01T13:18:10.170951](https://huggingface.co/datasets/open-llm-leaderboard/details_LTC-AI-Labs__Guanaco-Vicuna-7B-L2/blob/main/results_2023-10-01T13-18-10.170951.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4702719452109418,
"acc_stderr": 0.03523008113672575,
"acc_norm": 0.4742238812382433,
"acc_norm_stderr": 0.03521548128308249,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.42748996824542757,
"mc2_stderr": 0.01435868175068892
},
"harness|arc:challenge|25": {
"acc": 0.49573378839590443,
"acc_stderr": 0.014610858923956952,
"acc_norm": 0.5324232081911263,
"acc_norm_stderr": 0.014580637569995418
},
"harness|hellaswag|10": {
"acc": 0.5924118701453893,
"acc_stderr": 0.0049038158859832795,
"acc_norm": 0.788886675960964,
"acc_norm_stderr": 0.004072645874992221
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4716981132075472,
"acc_stderr": 0.0307235352490061,
"acc_norm": 0.4716981132075472,
"acc_norm_stderr": 0.0307235352490061
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.0413212501972337,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.0413212501972337
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159393,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159393
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419034,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419034
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4870967741935484,
"acc_stderr": 0.028434533152681848,
"acc_norm": 0.4870967741935484,
"acc_norm_stderr": 0.028434533152681848
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.032826493853041504,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.032826493853041504
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6,
"acc_stderr": 0.038254602783800246,
"acc_norm": 0.6,
"acc_norm_stderr": 0.038254602783800246
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.03547601494006937,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.03547601494006937
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43846153846153846,
"acc_stderr": 0.025158266016868564,
"acc_norm": 0.43846153846153846,
"acc_norm_stderr": 0.025158266016868564
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.02742001935094527,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.02742001935094527
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.655045871559633,
"acc_stderr": 0.020380605405066955,
"acc_norm": 0.655045871559633,
"acc_norm_stderr": 0.020380605405066955
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.028765111718046955,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.028765111718046955
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.03492406104163613,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.03492406104163613
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6540084388185654,
"acc_stderr": 0.03096481058878671,
"acc_norm": 0.6540084388185654,
"acc_norm_stderr": 0.03096481058878671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578757,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578757
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536824,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536824
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.039265223787088445,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.039265223787088445
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.04825729337356389,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.04825729337356389
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7008547008547008,
"acc_stderr": 0.029996951858349476,
"acc_norm": 0.7008547008547008,
"acc_norm_stderr": 0.029996951858349476
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.632183908045977,
"acc_stderr": 0.01724382889184627,
"acc_norm": 0.632183908045977,
"acc_norm_stderr": 0.01724382889184627
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5065359477124183,
"acc_stderr": 0.028627470550556054,
"acc_norm": 0.5065359477124183,
"acc_norm_stderr": 0.028627470550556054
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.02764814959975147,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.02764814959975147
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4845679012345679,
"acc_stderr": 0.0278074900442762,
"acc_norm": 0.4845679012345679,
"acc_norm_stderr": 0.0278074900442762
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281278,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281278
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35071707953063885,
"acc_stderr": 0.012187773370741518,
"acc_norm": 0.35071707953063885,
"acc_norm_stderr": 0.012187773370741518
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.020165523313907897,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.020165523313907897
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4489795918367347,
"acc_stderr": 0.03184213866687579,
"acc_norm": 0.4489795918367347,
"acc_norm_stderr": 0.03184213866687579
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.03320685889744324,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.03320685889744324
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.42748996824542757,
"mc2_stderr": 0.01435868175068892
}
}
```
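As a hypothetical post-processing step (the helper below and its tiny input excerpt are illustrative, not part of the dataset), the per-task dict above can be macro-averaged over the 57 hendrycksTest (MMLU) sub-tasks:

```python
from statistics import mean

def mmlu_macro_average(results: dict) -> float:
    """Unweighted mean of 'acc' over the hendrycksTest (MMLU) sub-tasks."""
    accs = [
        scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return mean(accs)

# Tiny excerpt of the JSON above, just to make the sketch runnable:
excerpt = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.45925925925925926},
    "harness|arc:challenge|25": {"acc": 0.49573378839590443},  # filtered out
}
print(mmlu_macro_average(excerpt))  # ~0.3746
```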
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_BELLE-2__BELLE-Llama2-13B-chat-0.4M | 2023-10-01T13:37:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of BELLE-2/BELLE-Llama2-13B-chat-0.4M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BELLE-2/BELLE-Llama2-13B-chat-0.4M](https://huggingface.co/BELLE-2/BELLE-Llama2-13B-chat-0.4M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BELLE-2__BELLE-Llama2-13B-chat-0.4M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T13:36:40.123057](https://huggingface.co/datasets/open-llm-leaderboard/details_BELLE-2__BELLE-Llama2-13B-chat-0.4M/blob/main/results_2023-10-01T13-36-40.123057.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5603649468972762,\n\
\ \"acc_stderr\": 0.03448003990087675,\n \"acc_norm\": 0.5646869028560217,\n\
\ \"acc_norm_stderr\": 0.03445826586041197,\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5085196510414306,\n\
\ \"mc2_stderr\": 0.015546325045597166\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5546075085324232,\n \"acc_stderr\": 0.014523987638344083,\n\
\ \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693026\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6201951802429795,\n\
\ \"acc_stderr\": 0.004843462545943502,\n \"acc_norm\": 0.8231428002389962,\n\
\ \"acc_norm_stderr\": 0.003807680331172903\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.03999309712777474,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.03999309712777474\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.03028500925900979,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.03028500925900979\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.024326310529149135,\n \"\
acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.024326310529149135\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.026923446059302837,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.026923446059302837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296732,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296732\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.025294608023986472,\n\
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.025294608023986472\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131137,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131137\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7559633027522936,\n \"acc_stderr\": 0.01841528635141641,\n \"\
acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.01841528635141641\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658342,\n\
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.0246624968452098,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.0246624968452098\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7547892720306514,\n\
\ \"acc_stderr\": 0.01538435228454394,\n \"acc_norm\": 0.7547892720306514,\n\
\ \"acc_norm_stderr\": 0.01538435228454394\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124655,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124655\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3888268156424581,\n\
\ \"acc_stderr\": 0.01630389953079613,\n \"acc_norm\": 0.3888268156424581,\n\
\ \"acc_norm_stderr\": 0.01630389953079613\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.02799672318063145,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.02799672318063145\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.027604689028581986,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.027604689028581986\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.027272582849839792,\n\
\ \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.027272582849839792\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573086,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573086\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3956975228161669,\n\
\ \"acc_stderr\": 0.012489290735449016,\n \"acc_norm\": 0.3956975228161669,\n\
\ \"acc_norm_stderr\": 0.012489290735449016\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.030306257722468307,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.030306257722468307\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5588235294117647,\n \"acc_stderr\": 0.02008736207670286,\n \
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.02008736207670286\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5085196510414306,\n\
\ \"mc2_stderr\": 0.015546325045597166\n }\n}\n```"
repo_url: https://huggingface.co/BELLE-2/BELLE-Llama2-13B-chat-0.4M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-36-40.123057.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-36-40.123057.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-36-40.123057.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-36-40.123057.parquet'
- config_name: results
data_files:
- split: 2023_10_01T13_36_40.123057
path:
- results_2023-10-01T13-36-40.123057.parquet
- split: latest
path:
- results_2023-10-01T13-36-40.123057.parquet
---
# Dataset Card for Evaluation run of BELLE-2/BELLE-Llama2-13B-chat-0.4M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/BELLE-2/BELLE-Llama2-13B-chat-0.4M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [BELLE-2/BELLE-Llama2-13B-chat-0.4M](https://huggingface.co/BELLE-2/BELLE-Llama2-13B-chat-0.4M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BELLE-2__BELLE-Llama2-13B-chat-0.4M",
"harness_truthfulqa_mc_0",
split="latest")
```
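Any of the per-task configurations and the aggregated `results` configuration can be loaded the same way. A minimal sketch, assuming the `datasets` library is installed and using configuration and split names declared in the YAML header of this card:

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_BELLE-2__BELLE-Llama2-13B-chat-0.4M"

# Per-task details: the "latest" split always resolves to the newest run, while
# timestamped splits (e.g. "2023_10_01T13_36_40.123057") pin a specific run.
details = load_dataset(REPO, "harness_hendrycksTest_world_religions_5", split="latest")
print(details)

# Run-level aggregates live in the "results" configuration.
results = load_dataset(REPO, "results", split="latest")
print(results[0])
```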
## Latest results
These are the [latest results from run 2023-10-01T13:36:40.123057](https://huggingface.co/datasets/open-llm-leaderboard/details_BELLE-2__BELLE-Llama2-13B-chat-0.4M/blob/main/results_2023-10-01T13-36-40.123057.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5603649468972762,
"acc_stderr": 0.03448003990087675,
"acc_norm": 0.5646869028560217,
"acc_norm_stderr": 0.03445826586041197,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5085196510414306,
"mc2_stderr": 0.015546325045597166
},
"harness|arc:challenge|25": {
"acc": 0.5546075085324232,
"acc_stderr": 0.014523987638344083,
"acc_norm": 0.606655290102389,
"acc_norm_stderr": 0.014275101465693026
},
"harness|hellaswag|10": {
"acc": 0.6201951802429795,
"acc_stderr": 0.004843462545943502,
"acc_norm": 0.8231428002389962,
"acc_norm_stderr": 0.003807680331172903
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.03999309712777474,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.03999309712777474
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.03028500925900979,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.03028500925900979
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.024326310529149135,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.024326310529149135
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302837,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.028979089794296732,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.028979089794296732
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.025294608023986472,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.025294608023986472
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131137,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131137
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.01841528635141641,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.01841528635141641
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.0246624968452098,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.0246624968452098
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7547892720306514,
"acc_stderr": 0.01538435228454394,
"acc_norm": 0.7547892720306514,
"acc_norm_stderr": 0.01538435228454394
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3888268156424581,
"acc_stderr": 0.01630389953079613,
"acc_norm": 0.3888268156424581,
"acc_norm_stderr": 0.01630389953079613
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.02799672318063145,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.02799672318063145
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581986,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581986
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.027272582849839792,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.027272582849839792
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573086,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573086
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3956975228161669,
"acc_stderr": 0.012489290735449016,
"acc_norm": 0.3956975228161669,
"acc_norm_stderr": 0.012489290735449016
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.030306257722468307,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.030306257722468307
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.02008736207670286,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.02008736207670286
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5085196510414306,
"mc2_stderr": 0.015546325045597166
}
}
```
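To work with these numbers programmatically instead of copying them from the card, one option is to download the results file directly. A hedged sketch, assuming the `huggingface_hub` library; the file name is the one referenced above, and the exact nesting of the metrics inside the JSON is an assumption:

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated results file named in this card.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_BELLE-2__BELLE-Llama2-13B-chat-0.4M",
    filename="results_2023-10-01T13-36-40.123057.json",
    repo_type="dataset",
)

with open(path) as f:
    payload = json.load(f)

# Assumption: the metrics shown above may sit at the top level or under a
# "results" key depending on the harness version; handle both layouts.
metrics = payload.get("results", payload)
print(metrics["all"])  # run-level averages (acc, acc_norm, mc1, mc2, ...)
```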
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Sao10K__Chat-Stheno-L2-13B | 2023-10-01T13:49:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Sao10K/Chat-Stheno-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Chat-Stheno-L2-13B](https://huggingface.co/Sao10K/Chat-Stheno-L2-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Chat-Stheno-L2-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T13:47:56.878037](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Chat-Stheno-L2-13B/blob/main/results_2023-10-01T13-47-56.878037.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5489269210394119,\n\
\ \"acc_stderr\": 0.03447281764550283,\n \"acc_norm\": 0.5530285378883756,\n\
\ \"acc_norm_stderr\": 0.03445404007838818,\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.43306667783613856,\n\
\ \"mc2_stderr\": 0.015282157778245296\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5392491467576792,\n \"acc_stderr\": 0.014566303676636583,\n\
\ \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.014401366641216386\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6128261302529376,\n\
\ \"acc_stderr\": 0.0048610845340870314,\n \"acc_norm\": 0.8095996813383788,\n\
\ \"acc_norm_stderr\": 0.003918145109742981\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.030365050829115205,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.030365050829115205\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.45664739884393063,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.032081157507886836,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.032081157507886836\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596444,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596444\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.02737987122994325,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.02737987122994325\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836557,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836557\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.029252823291803638,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.029252823291803638\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4948717948717949,\n \"acc_stderr\": 0.02534967290683866,\n \
\ \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.02534967290683866\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.02794045713622842,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.02794045713622842\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.032437180551374116,\n\
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.032437180551374116\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7412844036697248,\n \"acc_stderr\": 0.018776052319619627,\n \"\
acc_norm\": 0.7412844036697248,\n \"acc_norm_stderr\": 0.018776052319619627\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696042,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696042\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604257,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.02685345037700918,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.02685345037700918\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395953,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395953\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124655,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124655\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n\
\ \"acc_stderr\": 0.015251931579208176,\n \"acc_norm\": 0.29497206703910617,\n\
\ \"acc_norm_stderr\": 0.015251931579208176\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.02799672318063145,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.02799672318063145\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271143,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.029097675599463926,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.029097675599463926\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38722294654498046,\n\
\ \"acc_stderr\": 0.012441155326854924,\n \"acc_norm\": 0.38722294654498046,\n\
\ \"acc_norm_stderr\": 0.012441155326854924\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5441176470588235,\n \"acc_stderr\": 0.020148939420415745,\n \
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.020148939420415745\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087558,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087558\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.0389136449583582\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691584,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691584\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.43306667783613856,\n\
\ \"mc2_stderr\": 0.015282157778245296\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Chat-Stheno-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-47-56.878037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-47-56.878037.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-47-56.878037.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-47-56.878037.parquet'
- config_name: results
data_files:
- split: 2023_10_01T13_47_56.878037
path:
- results_2023-10-01T13-47-56.878037.parquet
- split: latest
path:
- results_2023-10-01T13-47-56.878037.parquet
---
# Dataset Card for Evaluation run of Sao10K/Chat-Stheno-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Chat-Stheno-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Chat-Stheno-L2-13B](https://huggingface.co/Sao10K/Chat-Stheno-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Chat-Stheno-L2-13B",
"harness_truthfulqa_mc_0",
split="train")
```
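The aggregated scores can be loaded the same way; here is a minimal sketch, using the `results` configuration and the `latest` split declared in the YAML header of this card:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split always
# points to the most recent evaluation run of this model.
results = load_dataset("open-llm-leaderboard/details_Sao10K__Chat-Stheno-L2-13B",
	"results",
	split="latest")
print(results)
```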
## Latest results
These are the [latest results from run 2023-10-01T13:47:56.878037](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Chat-Stheno-L2-13B/blob/main/results_2023-10-01T13-47-56.878037.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5489269210394119,
"acc_stderr": 0.03447281764550283,
"acc_norm": 0.5530285378883756,
"acc_norm_stderr": 0.03445404007838818,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.43306667783613856,
"mc2_stderr": 0.015282157778245296
},
"harness|arc:challenge|25": {
"acc": 0.5392491467576792,
"acc_stderr": 0.014566303676636583,
"acc_norm": 0.5844709897610921,
"acc_norm_stderr": 0.014401366641216386
},
"harness|hellaswag|10": {
"acc": 0.6128261302529376,
"acc_stderr": 0.0048610845340870314,
"acc_norm": 0.8095996813383788,
"acc_norm_stderr": 0.003918145109742981
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.030365050829115205,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.030365050829115205
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.044045561573747664,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.044045561573747664
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596444,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.02737987122994325,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.02737987122994325
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836557,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836557
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.029252823291803638,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.029252823291803638
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4948717948717949,
"acc_stderr": 0.02534967290683866,
"acc_norm": 0.4948717948717949,
"acc_norm_stderr": 0.02534967290683866
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.02794045713622842,
"acc_norm": 0.3,
"acc_norm_stderr": 0.02794045713622842
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.032437180551374116,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.032437180551374116
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7412844036697248,
"acc_stderr": 0.018776052319619627,
"acc_norm": 0.7412844036697248,
"acc_norm_stderr": 0.018776052319619627
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696042,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696042
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604257,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709697,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709697
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.02685345037700918,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.02685345037700918
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395953,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395953
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29497206703910617,
"acc_stderr": 0.015251931579208176,
"acc_norm": 0.29497206703910617,
"acc_norm_stderr": 0.015251931579208176
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.02799672318063145,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.02799672318063145
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271143,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.029097675599463926,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.029097675599463926
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38722294654498046,
"acc_stderr": 0.012441155326854924,
"acc_norm": 0.38722294654498046,
"acc_norm_stderr": 0.012441155326854924
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.020148939420415745,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.020148939420415745
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087558,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087558
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.0389136449583582,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.0389136449583582
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.43306667783613856,
"mc2_stderr": 0.015282157778245296
}
}
```
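For quick inspection, the per-task numbers above can be ranked programmatically. A minimal sketch follows; the variable `latest` is a hypothetical stand-in for the dict printed above (values copied verbatim from the card, most entries elided):
```python
# `latest` mirrors the results dict shown above; most entries are elided
# for brevity, and the values are copied verbatim from the card.
latest = {
    "all": {"acc_norm": 0.5530285378883756},
    "harness|arc:challenge|25": {"acc_norm": 0.5844709897610921},
    "harness|hellaswag|10": {"acc_norm": 0.8095996813383788},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.7251461988304093},
}

# Rank individual tasks by normalized accuracy, skipping the "all" aggregate
# and entries (such as truthfulqa:mc) that report other metrics.
ranked = sorted(
    ((task, m["acc_norm"]) for task, m in latest.items()
     if task != "all" and "acc_norm" in m),
    key=lambda kv: kv[1],
    reverse=True,
)
for task, score in ranked:
    print(f"{task}: {score:.3f}")
```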
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-2.2.1 | 2023-10-01T13:49:22.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-l2-13b-2.2.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-l2-13b-2.2.1](https://huggingface.co/jondurbin/airoboros-l2-13b-2.2.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-2.2.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T13:47:59.401032](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-2.2.1/blob/main/results_2023-10-01T13-47-59.401032.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5659754260450957,\n\
\ \"acc_stderr\": 0.03432191327135597,\n \"acc_norm\": 0.5700640385662075,\n\
\ \"acc_norm_stderr\": 0.03429927465348362,\n \"mc1\": 0.3427172582619339,\n\
\ \"mc1_stderr\": 0.016614949385347032,\n \"mc2\": 0.4942114491685882,\n\
\ \"mc2_stderr\": 0.015369137513850233\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558902,\n\
\ \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.01425856388051378\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6391157140011949,\n\
\ \"acc_stderr\": 0.004792755235823524,\n \"acc_norm\": 0.8376817367058355,\n\
\ \"acc_norm_stderr\": 0.0036798891253998164\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286648,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286648\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.0403299905396072,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.0403299905396072\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.02380952380952387,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.02380952380952387\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n\
\ \"acc_stderr\": 0.026302774983517418,\n \"acc_norm\": 0.6903225806451613,\n\
\ \"acc_norm_stderr\": 0.026302774983517418\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164542,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164542\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\
\ \"acc_stderr\": 0.015246803197398674,\n \"acc_norm\": 0.7611749680715197,\n\
\ \"acc_norm_stderr\": 0.015246803197398674\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.0261521986197268,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.0261521986197268\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3329608938547486,\n\
\ \"acc_stderr\": 0.015761716178397563,\n \"acc_norm\": 0.3329608938547486,\n\
\ \"acc_norm_stderr\": 0.015761716178397563\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.02778014120702335,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.02778014120702335\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882116,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882116\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.012604960816087375,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.012604960816087375\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5620915032679739,\n \"acc_stderr\": 0.020071257886886525,\n \
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.020071257886886525\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3427172582619339,\n\
\ \"mc1_stderr\": 0.016614949385347032,\n \"mc2\": 0.4942114491685882,\n\
\ \"mc2_stderr\": 0.015369137513850233\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-l2-13b-2.2.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-47-59.401032.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-47-59.401032.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-47-59.401032.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-47-59.401032.parquet'
- config_name: results
data_files:
- split: 2023_10_01T13_47_59.401032
path:
- results_2023-10-01T13-47-59.401032.parquet
- split: latest
path:
- results_2023-10-01T13-47-59.401032.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-13b-2.2.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-13b-2.2.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-13b-2.2.1](https://huggingface.co/jondurbin/airoboros-l2-13b-2.2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-2.2.1",
"harness_truthfulqa_mc_0",
split="train")
```
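As a complement, here is a minimal sketch (assuming the `datasets` library and the configurations listed above) for pulling the aggregated metrics of the latest run:
```python
from datasets import load_dataset

# The "results" configuration aggregates the metrics of every task in one table;
# the "latest" split always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-2.2.1",
    "results",
    split="latest",
)

# Inspect the aggregated metrics of the (single) latest run.
print(results[0])
```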
## Latest results
These are the [latest results from run 2023-10-01T13:47:59.401032](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-2.2.1/blob/main/results_2023-10-01T13-47-59.401032.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5659754260450957,
"acc_stderr": 0.03432191327135597,
"acc_norm": 0.5700640385662075,
"acc_norm_stderr": 0.03429927465348362,
"mc1": 0.3427172582619339,
"mc1_stderr": 0.016614949385347032,
"mc2": 0.4942114491685882,
"mc2_stderr": 0.015369137513850233
},
"harness|arc:challenge|25": {
"acc": 0.5665529010238908,
"acc_stderr": 0.014481376224558902,
"acc_norm": 0.6092150170648464,
"acc_norm_stderr": 0.01425856388051378
},
"harness|hellaswag|10": {
"acc": 0.6391157140011949,
"acc_stderr": 0.004792755235823524,
"acc_norm": 0.8376817367058355,
"acc_norm_stderr": 0.0036798891253998164
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286648,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286648
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.0403299905396072,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.0403299905396072
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.02380952380952387,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.02380952380952387
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164542,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5102564102564102,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.5102564102564102,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398674,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398674
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.0261521986197268,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.0261521986197268
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3329608938547486,
"acc_stderr": 0.015761716178397563,
"acc_norm": 0.3329608938547486,
"acc_norm_stderr": 0.015761716178397563
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.02778014120702335,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.02778014120702335
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882116,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882116
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.02942799403941999,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.02942799403941999
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087375,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087375
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.020071257886886525,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.020071257886886525
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3427172582619339,
"mc1_stderr": 0.016614949385347032,
"mc2": 0.4942114491685882,
"mc2_stderr": 0.015369137513850233
}
}
```
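If you prefer working with the raw JSON file linked above rather than the parquet configurations, it can be fetched straight from the dataset repository. This is a minimal sketch assuming `huggingface_hub` is installed and that the file follows the structure displayed above:
```python
import json

from huggingface_hub import hf_hub_download

# Download the per-run results file shown above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-2.2.1",
    filename="results_2023-10-01T13-47-59.401032.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# The "all" entry holds the metrics averaged across tasks.
print(results["all"])
```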
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
Each configuration contains one split per evaluation run, named after the run timestamp (here `2023_10_01T13_47_59.401032`), plus a `latest` split that always points to the most recent run.
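For example, a specific run can be loaded by its timestamped split instead of `latest` (a sketch reusing the configuration from the example above):
```python
from datasets import load_dataset

# Load one specific evaluation run via its timestamped split.
data = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-2.2.1",
    "harness_truthfulqa_mc_0",
    split="2023_10_01T13_47_59.401032",
)
```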
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TinyPixel__testmodel-3 | 2023-10-01T13:51:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TinyPixel/testmodel-3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TinyPixel/testmodel-3](https://huggingface.co/TinyPixel/testmodel-3) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TinyPixel__testmodel-3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T13:50:05.522780](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__testmodel-3/blob/main/results_2023-10-01T13-50-05.522780.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46822735252346837,\n\
\ \"acc_stderr\": 0.035230234963000304,\n \"acc_norm\": 0.47225136680204177,\n\
\ \"acc_norm_stderr\": 0.03521574532116693,\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707687,\n \"mc2\": 0.3875189010720103,\n\
\ \"mc2_stderr\": 0.013537362497855546\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4931740614334471,\n \"acc_stderr\": 0.014610029151379813,\n\
\ \"acc_norm\": 0.5324232081911263,\n \"acc_norm_stderr\": 0.014580637569995421\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.589026090420235,\n\
\ \"acc_stderr\": 0.004910049928688087,\n \"acc_norm\": 0.7871937860983867,\n\
\ \"acc_norm_stderr\": 0.004084552641903664\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.46037735849056605,\n \"acc_stderr\": 0.030676096599389184,\n\
\ \"acc_norm\": 0.46037735849056605,\n \"acc_norm_stderr\": 0.030676096599389184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.037657466938651504,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.037657466938651504\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5032258064516129,\n\
\ \"acc_stderr\": 0.028443414226438316,\n \"acc_norm\": 0.5032258064516129,\n\
\ \"acc_norm_stderr\": 0.028443414226438316\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.03366124489051451,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.03366124489051451\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806586,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806586\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4797979797979798,\n \"acc_stderr\": 0.0355944356556392,\n \"acc_norm\"\
: 0.4797979797979798,\n \"acc_norm_stderr\": 0.0355944356556392\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.0330881859441575,\n \
\ \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.0330881859441575\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.43846153846153846,\n \"acc_stderr\": 0.025158266016868564,\n\
\ \"acc_norm\": 0.43846153846153846,\n \"acc_norm_stderr\": 0.025158266016868564\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871923,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871923\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4327731092436975,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.4327731092436975,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6330275229357798,\n \"acc_stderr\": 0.020664675659520525,\n \"\
acc_norm\": 0.6330275229357798,\n \"acc_norm_stderr\": 0.020664675659520525\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.03054674526495318,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03054674526495318\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5294117647058824,\n \"acc_stderr\": 0.03503235296367992,\n \"\
acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03503235296367992\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6075949367088608,\n \"acc_stderr\": 0.03178471874564729,\n \
\ \"acc_norm\": 0.6075949367088608,\n \"acc_norm_stderr\": 0.03178471874564729\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n\
\ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n\
\ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n\
\ \"acc_stderr\": 0.029996951858349472,\n \"acc_norm\": 0.7008547008547008,\n\
\ \"acc_norm_stderr\": 0.029996951858349472\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6462324393358876,\n\
\ \"acc_stderr\": 0.017098184708161906,\n \"acc_norm\": 0.6462324393358876,\n\
\ \"acc_norm_stderr\": 0.017098184708161906\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5057803468208093,\n \"acc_stderr\": 0.026917296179149116,\n\
\ \"acc_norm\": 0.5057803468208093,\n \"acc_norm_stderr\": 0.026917296179149116\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.48366013071895425,\n \"acc_stderr\": 0.028614624752805413,\n\
\ \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.028614624752805413\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930477,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930477\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.02782074420373286,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.02782074420373286\n },\n\
\ \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3617021276595745,\n\
\ \"acc_stderr\": 0.028663820147199492,\n \"acc_norm\": 0.3617021276595745,\n\
\ \"acc_norm_stderr\": 0.028663820147199492\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.3617992177314211,\n \"acc_stderr\": 0.012272736233262931,\n\
\ \"acc_norm\": 0.3617992177314211,\n \"acc_norm_stderr\": 0.012272736233262931\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428188,\n \"\
acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428188\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4493464052287582,\n \"acc_stderr\": 0.020123766528027266,\n \
\ \"acc_norm\": 0.4493464052287582,\n \"acc_norm_stderr\": 0.020123766528027266\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4775510204081633,\n \"acc_stderr\": 0.031976941187136725,\n\
\ \"acc_norm\": 0.4775510204081633,\n \"acc_norm_stderr\": 0.031976941187136725\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268813,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268813\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707687,\n \"mc2\": 0.3875189010720103,\n\
\ \"mc2_stderr\": 0.013537362497855546\n }\n}\n```"
repo_url: https://huggingface.co/TinyPixel/testmodel-3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-50-05.522780.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-50-05.522780.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-50-05.522780.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-50-05.522780.parquet'
- config_name: results
data_files:
- split: 2023_10_01T13_50_05.522780
path:
- results_2023-10-01T13-50-05.522780.parquet
- split: latest
path:
- results_2023-10-01T13-50-05.522780.parquet
---
# Dataset Card for Evaluation run of TinyPixel/testmodel-3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TinyPixel/testmodel-3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TinyPixel/testmodel-3](https://huggingface.co/TinyPixel/testmodel-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TinyPixel__testmodel-3",
"harness_truthfulqa_mc_0",
split="train")
```
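As a complementary sketch, the aggregated results can be loaded the same way (the "results" configuration and the "latest" split are both declared in the YAML header above):
```python
from datasets import load_dataset

# Run-level aggregates live in the "results" configuration; the
# "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_TinyPixel__testmodel-3",
                       "results",
                       split="latest")
```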
## Latest results
These are the [latest results from run 2023-10-01T13:50:05.522780](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__testmodel-3/blob/main/results_2023-10-01T13-50-05.522780.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.46822735252346837,
"acc_stderr": 0.035230234963000304,
"acc_norm": 0.47225136680204177,
"acc_norm_stderr": 0.03521574532116693,
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707687,
"mc2": 0.3875189010720103,
"mc2_stderr": 0.013537362497855546
},
"harness|arc:challenge|25": {
"acc": 0.4931740614334471,
"acc_stderr": 0.014610029151379813,
"acc_norm": 0.5324232081911263,
"acc_norm_stderr": 0.014580637569995421
},
"harness|hellaswag|10": {
"acc": 0.589026090420235,
"acc_stderr": 0.004910049928688087,
"acc_norm": 0.7871937860983867,
"acc_norm_stderr": 0.004084552641903664
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777471,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.46037735849056605,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.46037735849056605,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.037657466938651504,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.037657466938651504
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776564,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5032258064516129,
"acc_stderr": 0.028443414226438316,
"acc_norm": 0.5032258064516129,
"acc_norm_stderr": 0.028443414226438316
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.03366124489051451,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.03366124489051451
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806586,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806586
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4797979797979798,
"acc_stderr": 0.0355944356556392,
"acc_norm": 0.4797979797979798,
"acc_norm_stderr": 0.0355944356556392
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.0330881859441575,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.0330881859441575
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43846153846153846,
"acc_stderr": 0.025158266016868564,
"acc_norm": 0.43846153846153846,
"acc_norm_stderr": 0.025158266016868564
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871923,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871923
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4327731092436975,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.4327731092436975,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6330275229357798,
"acc_stderr": 0.020664675659520525,
"acc_norm": 0.6330275229357798,
"acc_norm_stderr": 0.020664675659520525
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03054674526495318,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03054674526495318
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03503235296367992,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03503235296367992
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6075949367088608,
"acc_stderr": 0.03178471874564729,
"acc_norm": 0.6075949367088608,
"acc_norm_stderr": 0.03178471874564729
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5560538116591929,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.5560538116591929,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.03926522378708843,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.03926522378708843
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7008547008547008,
"acc_stderr": 0.029996951858349472,
"acc_norm": 0.7008547008547008,
"acc_norm_stderr": 0.029996951858349472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6462324393358876,
"acc_stderr": 0.017098184708161906,
"acc_norm": 0.6462324393358876,
"acc_norm_stderr": 0.017098184708161906
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5057803468208093,
"acc_stderr": 0.026917296179149116,
"acc_norm": 0.5057803468208093,
"acc_norm_stderr": 0.026917296179149116
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.028614624752805413,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.028614624752805413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930477,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930477
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5,
"acc_stderr": 0.02782074420373286,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02782074420373286
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3617992177314211,
"acc_stderr": 0.012272736233262931,
"acc_norm": 0.3617992177314211,
"acc_norm_stderr": 0.012272736233262931
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428188,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428188
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4493464052287582,
"acc_stderr": 0.020123766528027266,
"acc_norm": 0.4493464052287582,
"acc_norm_stderr": 0.020123766528027266
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4775510204081633,
"acc_stderr": 0.031976941187136725,
"acc_norm": 0.4775510204081633,
"acc_norm_stderr": 0.031976941187136725
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268813,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268813
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707687,
"mc2": 0.3875189010720103,
"mc2_stderr": 0.013537362497855546
}
}
```
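As a minimal sketch (assuming the JSON above has been saved locally as `results.json`, a hypothetical filename), the run-level aggregates can be read directly from the `"all"` key:
```python
import json

# Parse the raw results file (local copy of the JSON shown above).
with open("results.json") as f:
    results = json.load(f)

# "all" holds run-level aggregates; per-task entries use keys such as
# "harness|arc:challenge|25" or "harness|truthfulqa:mc|0".
print(f"acc:      {results['all']['acc']:.4f}")
print(f"acc_norm: {results['all']['acc_norm']:.4f}")
print(f"mc2:      {results['harness|truthfulqa:mc|0']['mc2']:.4f}")
```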
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b-V2 | 2023-10-01T13:53:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of totally-not-an-llm/PuddleJumper-13b-V2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [totally-not-an-llm/PuddleJumper-13b-V2](https://huggingface.co/totally-not-an-llm/PuddleJumper-13b-V2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b-V2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T13:51:37.934031](https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b-V2/blob/main/results_2023-10-01T13-51-37.934031.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5827252608276984,\n\
\ \"acc_stderr\": 0.03411844148436114,\n \"acc_norm\": 0.5866394108578599,\n\
\ \"acc_norm_stderr\": 0.034100580587612915,\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.016656997109125146,\n \"mc2\": 0.526604740797921,\n\
\ \"mc2_stderr\": 0.015948037885326335\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5443686006825939,\n \"acc_stderr\": 0.014553749939306864,\n\
\ \"acc_norm\": 0.5699658703071673,\n \"acc_norm_stderr\": 0.014467631559137996\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6052579167496515,\n\
\ \"acc_stderr\": 0.004877962644991875,\n \"acc_norm\": 0.8105954989046007,\n\
\ \"acc_norm_stderr\": 0.003910288117015165\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646796,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646796\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.02450877752102842,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.02450877752102842\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\
\ \"acc_stderr\": 0.027162537826948458,\n \"acc_norm\": 0.6483870967741936,\n\
\ \"acc_norm_stderr\": 0.027162537826948458\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885416,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885416\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915331,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915331\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.02478431694215638,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.02478431694215638\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.031631458075523776,\n\
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.031631458075523776\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787586,\n \"\
acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787586\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.02537213967172293,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.02537213967172293\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n\
\ \"acc_stderr\": 0.01504630184669181,\n \"acc_norm\": 0.7701149425287356,\n\
\ \"acc_norm_stderr\": 0.01504630184669181\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895806,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895806\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.49385474860335193,\n\
\ \"acc_stderr\": 0.016721238483631412,\n \"acc_norm\": 0.49385474860335193,\n\
\ \"acc_norm_stderr\": 0.016721238483631412\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.027732834353363944,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.027732834353363944\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n\
\ \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.4517601043024772,\n\
\ \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.03016191193076711,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.03016191193076711\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.576797385620915,\n \"acc_stderr\": 0.019987809769482064,\n \
\ \"acc_norm\": 0.576797385620915,\n \"acc_norm_stderr\": 0.019987809769482064\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.016656997109125146,\n \"mc2\": 0.526604740797921,\n\
\ \"mc2_stderr\": 0.015948037885326335\n }\n}\n```"
repo_url: https://huggingface.co/totally-not-an-llm/PuddleJumper-13b-V2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-51-37.934031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-51-37.934031.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-51-37.934031.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-51-37.934031.parquet'
- config_name: results
data_files:
- split: 2023_10_01T13_51_37.934031
path:
- results_2023-10-01T13-51-37.934031.parquet
- split: latest
path:
- results_2023-10-01T13-51-37.934031.parquet
---
# Dataset Card for Evaluation run of totally-not-an-llm/PuddleJumper-13b-V2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/totally-not-an-llm/PuddleJumper-13b-V2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [totally-not-an-llm/PuddleJumper-13b-V2](https://huggingface.co/totally-not-an-llm/PuddleJumper-13b-V2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b-V2",
"harness_truthfulqa_mc_0",
split="train")
```
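The run-level aggregates shown below are also exposed through the `results` config declared in the YAML header; a minimal sketch of loading them the same way (the `latest` split is the one each config defines):
```python
from datasets import load_dataset

# Aggregated metrics for the run (the "results" config declared above);
# "latest" always resolves to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b-V2",
    "results",
    split="latest",
)
```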
## Latest results
These are the [latest results from run 2023-10-01T13:51:37.934031](https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b-V2/blob/main/results_2023-10-01T13-51-37.934031.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5827252608276984,
"acc_stderr": 0.03411844148436114,
"acc_norm": 0.5866394108578599,
"acc_norm_stderr": 0.034100580587612915,
"mc1": 0.3463892288861689,
"mc1_stderr": 0.016656997109125146,
"mc2": 0.526604740797921,
"mc2_stderr": 0.015948037885326335
},
"harness|arc:challenge|25": {
"acc": 0.5443686006825939,
"acc_stderr": 0.014553749939306864,
"acc_norm": 0.5699658703071673,
"acc_norm_stderr": 0.014467631559137996
},
"harness|hellaswag|10": {
"acc": 0.6052579167496515,
"acc_stderr": 0.004877962644991875,
"acc_norm": 0.8105954989046007,
"acc_norm_stderr": 0.003910288117015165
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646796,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646796
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.02450877752102842,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.02450877752102842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.027162537826948458,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.027162537826948458
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885416,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885416
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915331,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915331
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.02478431694215638,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.02478431694215638
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.031631458075523776,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.031631458075523776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787586,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787586
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134986,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134986
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.02537213967172293,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.02537213967172293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7701149425287356,
"acc_stderr": 0.01504630184669181,
"acc_norm": 0.7701149425287356,
"acc_norm_stderr": 0.01504630184669181
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895806,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895806
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.49385474860335193,
"acc_stderr": 0.016721238483631412,
"acc_norm": 0.49385474860335193,
"acc_norm_stderr": 0.016721238483631412
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.027732834353363944,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.027732834353363944
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140446,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140446
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4517601043024772,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.4517601043024772,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.03016191193076711,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.03016191193076711
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.576797385620915,
"acc_stderr": 0.019987809769482064,
"acc_norm": 0.576797385620915,
"acc_norm_stderr": 0.019987809769482064
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3463892288861689,
"mc1_stderr": 0.016656997109125146,
"mc2": 0.526604740797921,
"mc2_stderr": 0.015948037885326335
}
}
```
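As a quick illustration of working with the results file linked above, the sketch below recomputes the mean accuracy over the `hendrycksTest-*` (MMLU) subjects. It assumes the JSON has been downloaded locally and that its top-level layout matches the excerpt above (task name mapping to a metric dict); adjust the lookup if the file nests these keys under another field.
```python
import json

# Hypothetical local copy of the results file linked above.
with open("results_2023-10-01T13-51-37.934031.json") as f:
    results = json.load(f)

# Mean acc over the MMLU subjects only. Illustrative: the "all" block above
# aggregates across every task, so this MMLU-only mean will differ from it.
mmlu = {k: v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")}
print(f"{len(mmlu)} subjects, mean acc = {sum(mmlu.values()) / len(mmlu):.4f}")
```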
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tazarov/dst12345 | 2023-10-01T15:00:38.000Z | [
"size_categories:n<1K",
"language:en",
"license:mit",
"region:us"
] | tazarov | null | null | null | 0 | 0 | ---
language: en
license: mit
size_categories:
- n<1K
pretty_name: Chroma export of collection 4421321
dataset_info:
features:
- name: id
dtype: string
- name: embedding
sequence: float32
- name: document
dtype: string
- name: metadata._id
dtype: string
- name: metadata.title
dtype: string
splits:
- name: train
num_bytes: 1320534
num_examples: 200
download_size: 1297705
dataset_size: 1320534
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
x-chroma:
description: Chroma Dataset for collection 4421321
collection: '4421321'
metadata:
hnsw:space: ip
test: 123
---
# Dataset Card for "dst12345"
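Since the card is otherwise a stub, here is a minimal, hypothetical sketch of restoring this export into a Chroma collection. It assumes the standard `datasets` and `chromadb` client APIs, and takes the collection name and distance metric from the `x-chroma` block above:
```python
import chromadb
from datasets import load_dataset

# Exported rows: id, embedding, document, and flattened metadata.* columns.
rows = load_dataset("tazarov/dst12345", split="train")

# Recreate the collection with the inner-product space recorded in x-chroma.
client = chromadb.Client()
collection = client.create_collection(name="4421321", metadata={"hnsw:space": "ip"})

collection.add(
    ids=rows["id"],
    embeddings=rows["embedding"],
    documents=rows["document"],
    metadatas=[{"_id": i, "title": t}
               for i, t in zip(rows["metadata._id"], rows["metadata.title"])],
)
```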
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ZacharySkanHF/Ahoy-AI-Model | 2023-10-01T14:16:26.000Z | [
"region:us"
] | ZacharySkanHF | null | null | null | 0 | 0 | Entry not found |
AlignmentLab-AI/caption-creation | 2023-10-01T14:22:12.000Z | [
"region:us"
] | AlignmentLab-AI | null | null | null | 0 | 0 | Entry not found |
dhenypatungka/dheny-test | 2023-10-01T14:17:47.000Z | [
"region:us"
] | dhenypatungka | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o | 2023-10-01T14:26:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"latest\" split always points to the most recent\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T14:24:56.870950](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o/blob/main/results_2023-10-01T14-24-56.870950.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5314690625239179,\n\
\ \"acc_stderr\": 0.03477238894479982,\n \"acc_norm\": 0.5357570689782581,\n\
\ \"acc_norm_stderr\": 0.03475274964710231,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.401595331656996,\n\
\ \"mc2_stderr\": 0.014052827328412865\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5460750853242321,\n \"acc_stderr\": 0.014549221105171867,\n\
\ \"acc_norm\": 0.590443686006826,\n \"acc_norm_stderr\": 0.01437035863247244\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.602867954590719,\n\
\ \"acc_stderr\": 0.0048830377589199675,\n \"acc_norm\": 0.8114917347142003,\n\
\ \"acc_norm_stderr\": 0.0039031816674663686\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5660377358490566,\n \"acc_stderr\": 0.030503292013342596,\n\
\ \"acc_norm\": 0.5660377358490566,\n \"acc_norm_stderr\": 0.030503292013342596\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.0240268463928735,\n \"acc_norm\"\
: 0.3201058201058201,\n \"acc_norm_stderr\": 0.0240268463928735\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\
\ \"acc_stderr\": 0.027575960723278246,\n \"acc_norm\": 0.6225806451612903,\n\
\ \"acc_norm_stderr\": 0.027575960723278246\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.034524539038220406,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.034524539038220406\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.03345678422756775,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.03345678422756775\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.02951928261681723,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.02951928261681723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47692307692307695,\n \"acc_stderr\": 0.025323990861736118,\n\
\ \"acc_norm\": 0.47692307692307695,\n \"acc_norm_stderr\": 0.025323990861736118\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114982,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114982\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.031918633744784645,\n \
\ \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.031918633744784645\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7155963302752294,\n \"acc_stderr\": 0.01934203658770258,\n \"\
acc_norm\": 0.7155963302752294,\n \"acc_norm_stderr\": 0.01934203658770258\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n \"\
acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842538,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842538\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6319018404907976,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.6319018404907976,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729245,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729245\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7279693486590039,\n\
\ \"acc_stderr\": 0.0159133674475005,\n \"acc_norm\": 0.7279693486590039,\n\
\ \"acc_norm_stderr\": 0.0159133674475005\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.026538189104705477,\n\
\ \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.026538189104705477\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2905027932960894,\n\
\ \"acc_stderr\": 0.015183844307206141,\n \"acc_norm\": 0.2905027932960894,\n\
\ \"acc_norm_stderr\": 0.015183844307206141\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.028245134024387292,\n\
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.028245134024387292\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.02760468902858198,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.02760468902858198\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.02743162372241501,\n\
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.02743162372241501\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.02927553215970472,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.02927553215970472\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3917861799217731,\n\
\ \"acc_stderr\": 0.012467564418145111,\n \"acc_norm\": 0.3917861799217731,\n\
\ \"acc_norm_stderr\": 0.012467564418145111\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4963235294117647,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.4963235294117647,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5555555555555556,\n \"acc_stderr\": 0.020102583895887188,\n \
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.020102583895887188\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5510204081632653,\n \"acc_stderr\": 0.03184213866687579,\n\
\ \"acc_norm\": 0.5510204081632653,\n \"acc_norm_stderr\": 0.03184213866687579\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.03307615947979033,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.03307615947979033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.401595331656996,\n\
\ \"mc2_stderr\": 0.014052827328412865\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-24-56.870950.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-24-56.870950.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-24-56.870950.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-24-56.870950.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_24_56.870950
path:
- results_2023-10-01T14-24-56.870950.parquet
- split: latest
path:
- results_2023-10-01T14-24-56.870950.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o",
"harness_truthfulqa_mc_0",
split="train")
```
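The aggregated metrics live in the separate "results" configuration described above; a minimal sketch for loading them (same repository, using the "latest" split defined in the metadata):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o",
    "results",
    split="latest",
)
```
The per-task detail configurations follow the same pattern, swapping in any of the config names listed in the metadata above.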
## Latest results
These are the [latest results from run 2023-10-01T14:24:56.870950](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o/blob/main/results_2023-10-01T14-24-56.870950.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5314690625239179,
"acc_stderr": 0.03477238894479982,
"acc_norm": 0.5357570689782581,
"acc_norm_stderr": 0.03475274964710231,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.401595331656996,
"mc2_stderr": 0.014052827328412865
},
"harness|arc:challenge|25": {
"acc": 0.5460750853242321,
"acc_stderr": 0.014549221105171867,
"acc_norm": 0.590443686006826,
"acc_norm_stderr": 0.01437035863247244
},
"harness|hellaswag|10": {
"acc": 0.602867954590719,
"acc_stderr": 0.0048830377589199675,
"acc_norm": 0.8114917347142003,
"acc_norm_stderr": 0.0039031816674663686
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5660377358490566,
"acc_stderr": 0.030503292013342596,
"acc_norm": 0.5660377358490566,
"acc_norm_stderr": 0.030503292013342596
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364763,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.0240268463928735,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.0240268463928735
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278246,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278246
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.034524539038220406,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.034524539038220406
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.03345678422756775,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.03345678422756775
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.02951928261681723,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.02951928261681723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47692307692307695,
"acc_stderr": 0.025323990861736118,
"acc_norm": 0.47692307692307695,
"acc_norm_stderr": 0.025323990861736118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114982,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114982
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.031918633744784645,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.031918633744784645
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7155963302752294,
"acc_stderr": 0.01934203658770258,
"acc_norm": 0.7155963302752294,
"acc_norm_stderr": 0.01934203658770258
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373617,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842538,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842538
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134986,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134986
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6319018404907976,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.6319018404907976,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729245,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729245
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7279693486590039,
"acc_stderr": 0.0159133674475005,
"acc_norm": 0.7279693486590039,
"acc_norm_stderr": 0.0159133674475005
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.026538189104705477,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.026538189104705477
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2905027932960894,
"acc_stderr": 0.015183844307206141,
"acc_norm": 0.2905027932960894,
"acc_norm_stderr": 0.015183844307206141
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.028245134024387292,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.028245134024387292
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.02760468902858198,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.02760468902858198
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.02743162372241501,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.02743162372241501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.02927553215970472,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.02927553215970472
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3917861799217731,
"acc_stderr": 0.012467564418145111,
"acc_norm": 0.3917861799217731,
"acc_norm_stderr": 0.012467564418145111
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4963235294117647,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.4963235294117647,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.020102583895887188,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.020102583895887188
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5510204081632653,
"acc_stderr": 0.03184213866687579,
"acc_norm": 0.5510204081632653,
"acc_norm_stderr": 0.03184213866687579
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979033,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979033
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.401595331656996,
"mc2_stderr": 0.014052827328412865
}
}
```
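To read this same summary programmatically, a minimal sketch using `huggingface_hub` (the exact layout of the raw JSON file is an assumption; it may wrap the metrics shown above under a "results" key):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file linked above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o",
    filename="results_2023-10-01T14-24-56.870950.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Assumption: the metrics may be nested under a "results" key;
# fall back to the top level otherwise.
metrics = data.get("results", data)
print(metrics["all"]["acc"])  # aggregated accuracy reported above (~0.5315)
```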
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down | 2023-10-01T14:30:22.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T14:29:00.192136](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down/blob/main/results_2023-10-01T14-29-00.192136.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5374581948634449,\n\
\ \"acc_stderr\": 0.034676980627825364,\n \"acc_norm\": 0.5415297361439858,\n\
\ \"acc_norm_stderr\": 0.034658104554775776,\n \"mc1\": 0.2692778457772338,\n\
\ \"mc1_stderr\": 0.01552856663708729,\n \"mc2\": 0.3922571495872073,\n\
\ \"mc2_stderr\": 0.014138691335704716\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5290102389078498,\n \"acc_stderr\": 0.014586776355294324,\n\
\ \"acc_norm\": 0.5639931740614335,\n \"acc_norm_stderr\": 0.014491225699230916\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6140211113324039,\n\
\ \"acc_stderr\": 0.004858306877874621,\n \"acc_norm\": 0.819259111730731,\n\
\ \"acc_norm_stderr\": 0.00384016922401227\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.0404633688397825,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.0404633688397825\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776285,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776285\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.0413212501972337,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.0413212501972337\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.024373197867983053,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.024373197867983053\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6161290322580645,\n\
\ \"acc_stderr\": 0.027666182075539652,\n \"acc_norm\": 0.6161290322580645,\n\
\ \"acc_norm_stderr\": 0.027666182075539652\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.03332299921070643,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.03332299921070643\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.029778663037752954,\n\
\ \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.029778663037752954\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4717948717948718,\n \"acc_stderr\": 0.025310639254933893,\n\
\ \"acc_norm\": 0.4717948717948718,\n \"acc_norm_stderr\": 0.025310639254933893\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.032422250271150074,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.032422250271150074\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4027777777777778,\n \"acc_stderr\": 0.033448873829978666,\n \"\
acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.033448873829978666\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.037466683254700206,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.037466683254700206\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n\
\ \"acc_stderr\": 0.02742100729539291,\n \"acc_norm\": 0.7735042735042735,\n\
\ \"acc_norm_stderr\": 0.02742100729539291\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7445721583652618,\n\
\ \"acc_stderr\": 0.015594955384455765,\n \"acc_norm\": 0.7445721583652618,\n\
\ \"acc_norm_stderr\": 0.015594955384455765\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.02626167760780665,\n\
\ \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.02626167760780665\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095273,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095273\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.02849199358617156,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.02849199358617156\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.02736807824397162,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.02736807824397162\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n\
\ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573086,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573086\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39895697522816165,\n\
\ \"acc_stderr\": 0.012506757655293679,\n \"acc_norm\": 0.39895697522816165,\n\
\ \"acc_norm_stderr\": 0.012506757655293679\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.030372836961539352,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.030372836961539352\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.5392156862745098,\n \"acc_stderr\": 0.020165523313907904,\n \"\
acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.020165523313907904\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5306122448979592,\n \"acc_stderr\": 0.031949171367580624,\n\
\ \"acc_norm\": 0.5306122448979592,\n \"acc_norm_stderr\": 0.031949171367580624\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2692778457772338,\n\
\ \"mc1_stderr\": 0.01552856663708729,\n \"mc2\": 0.3922571495872073,\n\
\ \"mc2_stderr\": 0.014138691335704716\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-29-00.192136.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- results_2023-10-01T14-29-00.192136.parquet
- split: latest
path:
- results_2023-10-01T14-29-00.192136.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down",
"harness_truthfulqa_mc_0",
	split="latest")
```
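The aggregated metrics live in the "results" configuration, and every configuration also exposes a "latest" split (see the YAML configs above). As a minimal sketch, assuming only that the `datasets` library is installed, you could enumerate the configurations and pull the aggregated results like this:
```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down"

# List the 61 per-task configurations plus the aggregated "results" one.
print(get_dataset_config_names(REPO))

# Load the aggregated metrics; the "latest" split always points to the
# most recent run (here 2023-10-01T14:29:00.192136).
results = load_dataset(REPO, "results", split="latest")
print(results[0])
```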
## Latest results
These are the [latest results from run 2023-10-01T14:29:00.192136](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down/blob/main/results_2023-10-01T14-29-00.192136.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5374581948634449,
"acc_stderr": 0.034676980627825364,
"acc_norm": 0.5415297361439858,
"acc_norm_stderr": 0.034658104554775776,
"mc1": 0.2692778457772338,
"mc1_stderr": 0.01552856663708729,
"mc2": 0.3922571495872073,
"mc2_stderr": 0.014138691335704716
},
"harness|arc:challenge|25": {
"acc": 0.5290102389078498,
"acc_stderr": 0.014586776355294324,
"acc_norm": 0.5639931740614335,
"acc_norm_stderr": 0.014491225699230916
},
"harness|hellaswag|10": {
"acc": 0.6140211113324039,
"acc_stderr": 0.004858306877874621,
"acc_norm": 0.819259111730731,
"acc_norm_stderr": 0.00384016922401227
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.0404633688397825,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.0404633688397825
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.030151134457776285,
"acc_norm": 0.6,
"acc_norm_stderr": 0.030151134457776285
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.0413212501972337,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.0413212501972337
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.024373197867983053,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.024373197867983053
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6161290322580645,
"acc_stderr": 0.027666182075539652,
"acc_norm": 0.6161290322580645,
"acc_norm_stderr": 0.027666182075539652
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.03332299921070643,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.03332299921070643
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4717948717948718,
"acc_stderr": 0.025310639254933893,
"acc_norm": 0.4717948717948718,
"acc_norm_stderr": 0.025310639254933893
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.032422250271150074,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.032422250271150074
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.033448873829978666,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.033448873829978666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.037466683254700206,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.037466683254700206
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7735042735042735,
"acc_stderr": 0.02742100729539291,
"acc_norm": 0.7735042735042735,
"acc_norm_stderr": 0.02742100729539291
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7445721583652618,
"acc_stderr": 0.015594955384455765,
"acc_norm": 0.7445721583652618,
"acc_norm_stderr": 0.015594955384455765
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6098265895953757,
"acc_stderr": 0.02626167760780665,
"acc_norm": 0.6098265895953757,
"acc_norm_stderr": 0.02626167760780665
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095273,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095273
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.02849199358617156,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.02849199358617156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.02736807824397162,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.02736807824397162
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380154,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573086,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573086
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39895697522816165,
"acc_stderr": 0.012506757655293679,
"acc_norm": 0.39895697522816165,
"acc_norm_stderr": 0.012506757655293679
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5,
"acc_stderr": 0.030372836961539352,
"acc_norm": 0.5,
"acc_norm_stderr": 0.030372836961539352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.020165523313907904,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.020165523313907904
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5306122448979592,
"acc_stderr": 0.031949171367580624,
"acc_norm": 0.5306122448979592,
"acc_norm_stderr": 0.031949171367580624
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2692778457772338,
"mc1_stderr": 0.01552856663708729,
"mc2": 0.3922571495872073,
"mc2_stderr": 0.014138691335704716
}
}
```
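The `"all"` block above aggregates the per-task metrics. As a hedged sketch (the file name comes from the results link above, but the top-level layout of the JSON is an assumption), you could recompute the MMLU average over the 57 `hendrycksTest` tasks yourself:
```python
import json
from statistics import mean

from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down",
    filename="results_2023-10-01T14-29-00.192136.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Assumption: the per-task dict shown above may sit under a top-level
# "results" key; fall back to the document root otherwise.
results = data.get("results", data)

mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(f"MMLU 5-shot macro-average acc over {len(mmlu)} tasks: {mean(mmlu):.4f}")
```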
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o | 2023-10-01T14:34:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T14:32:44.417888](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o/blob/main/results_2023-10-01T14-32-44.417888.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks; you can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5507671171020679,\n\
\ \"acc_stderr\": 0.03452836374060718,\n \"acc_norm\": 0.5551144278998122,\n\
\ \"acc_norm_stderr\": 0.03450922754729939,\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.01543821111952251,\n \"mc2\": 0.40124945157279795,\n\
\ \"mc2_stderr\": 0.013978095978126583\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.514505119453925,\n \"acc_stderr\": 0.014605241081370053,\n\
\ \"acc_norm\": 0.560580204778157,\n \"acc_norm_stderr\": 0.014503747823580125\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6084445329615614,\n\
\ \"acc_stderr\": 0.00487100593940747,\n \"acc_norm\": 0.8188607847042422,\n\
\ \"acc_norm_stderr\": 0.0038434637920379106\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009794,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009794\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278006,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278006\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842507,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842507\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871137,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871137\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\
\ \"acc_stderr\": 0.027327548447957536,\n \"acc_norm\": 0.6387096774193548,\n\
\ \"acc_norm_stderr\": 0.027327548447957536\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.03376458246509567,\n\
\ \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.03376458246509567\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.03289477330098616,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.03289477330098616\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164525,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164525\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.49230769230769234,\n \"acc_stderr\": 0.02534800603153477,\n\
\ \"acc_norm\": 0.49230769230769234,\n \"acc_norm_stderr\": 0.02534800603153477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.744954128440367,\n \"acc_stderr\": 0.01868850085653583,\n \"acc_norm\"\
: 0.744954128440367,\n \"acc_norm_stderr\": 0.01868850085653583\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n\
\ \"acc_stderr\": 0.03385177976044811,\n \"acc_norm\": 0.4398148148148148,\n\
\ \"acc_norm_stderr\": 0.03385177976044811\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.03198001660115071,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.03198001660115071\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598025,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598025\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516304,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516304\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.024904439098918214,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.024904439098918214\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7598978288633461,\n\
\ \"acc_stderr\": 0.015274685213734195,\n \"acc_norm\": 0.7598978288633461,\n\
\ \"acc_norm_stderr\": 0.015274685213734195\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016117,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016117\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n\
\ \"acc_stderr\": 0.016277927039638193,\n \"acc_norm\": 0.3854748603351955,\n\
\ \"acc_norm_stderr\": 0.016277927039638193\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5751633986928104,\n \"acc_stderr\": 0.02830457667314111,\n\
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.02830457667314111\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192717,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192717\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4322033898305085,\n\
\ \"acc_stderr\": 0.012652297777114968,\n \"acc_norm\": 0.4322033898305085,\n\
\ \"acc_norm_stderr\": 0.012652297777114968\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03004261583271486,\n\
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03004261583271486\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5637254901960784,\n \"acc_stderr\": 0.020062874243539128,\n \
\ \"acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.020062874243539128\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n\
\ \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.01543821111952251,\n \"mc2\": 0.40124945157279795,\n\
\ \"mc2_stderr\": 0.013978095978126583\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-32-44.417888.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-32-44.417888.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-32-44.417888.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-32-44.417888.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_32_44.417888
path:
- results_2023-10-01T14-32-44.417888.parquet
- split: latest
path:
- results_2023-10-01T14-32-44.417888.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o",
"harness_truthfulqa_mc_0",
split="train")
```
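
If you want the aggregated scores rather than per-example details, the same pattern works with the "results" configuration and the "latest" split (a minimal sketch based on the configurations listed above):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o",
	"results",
	split="latest")
```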
## Latest results
These are the [latest results from run 2023-10-01T14:32:44.417888](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o/blob/main/results_2023-10-01T14-32-44.417888.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5507671171020679,
"acc_stderr": 0.03452836374060718,
"acc_norm": 0.5551144278998122,
"acc_norm_stderr": 0.03450922754729939,
"mc1": 0.26438188494492043,
"mc1_stderr": 0.01543821111952251,
"mc2": 0.40124945157279795,
"mc2_stderr": 0.013978095978126583
},
"harness|arc:challenge|25": {
"acc": 0.514505119453925,
"acc_stderr": 0.014605241081370053,
"acc_norm": 0.560580204778157,
"acc_norm_stderr": 0.014503747823580125
},
"harness|hellaswag|10": {
"acc": 0.6084445329615614,
"acc_stderr": 0.00487100593940747,
"acc_norm": 0.8188607847042422,
"acc_norm_stderr": 0.0038434637920379106
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009794,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009794
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278006,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278006
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.02422996529842507,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.02422996529842507
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871137,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871137
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.027327548447957536,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.027327548447957536
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.03376458246509567,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.03376458246509567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.03289477330098616,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.03289477330098616
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164525,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164525
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49230769230769234,
"acc_stderr": 0.02534800603153477,
"acc_norm": 0.49230769230769234,
"acc_norm_stderr": 0.02534800603153477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.744954128440367,
"acc_stderr": 0.01868850085653583,
"acc_norm": 0.744954128440367,
"acc_norm_stderr": 0.01868850085653583
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598025,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598025
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516304,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516304
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.024904439098918214,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.024904439098918214
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7598978288633461,
"acc_stderr": 0.015274685213734195,
"acc_norm": 0.7598978288633461,
"acc_norm_stderr": 0.015274685213734195
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016117,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016117
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3854748603351955,
"acc_stderr": 0.016277927039638193,
"acc_norm": 0.3854748603351955,
"acc_norm_stderr": 0.016277927039638193
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.02830457667314111,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.02830457667314111
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192717,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192717
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4322033898305085,
"acc_stderr": 0.012652297777114968,
"acc_norm": 0.4322033898305085,
"acc_norm_stderr": 0.012652297777114968
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03004261583271486,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03004261583271486
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.020062874243539128,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.020062874243539128
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.046737523336702384,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.046737523336702384
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26438188494492043,
"mc1_stderr": 0.01543821111952251,
"mc2": 0.40124945157279795,
"mc2_stderr": 0.013978095978126583
}
}
```
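To pull a single number out of these aggregated results without loading the full dataset, you can also fetch the raw JSON file directly (a sketch using `huggingface_hub`; note that the raw file may wrap the metrics shown above in additional top-level keys such as run metadata, so the key access below is an assumption based on the snippet above):
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file referenced above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-q_k_v_o",
    filename="results_2023-10-01T14-32-44.417888.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Assuming the layout shown above: the overall accuracy across tasks.
print(results["all"]["acc"])
```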
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Linly-AI__Chinese-LLaMA-2-7B-hf | 2023-10-01T14:36:40.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Linly-AI/Chinese-LLaMA-2-7B-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Linly-AI/Chinese-LLaMA-2-7B-hf](https://huggingface.co/Linly-AI/Chinese-LLaMA-2-7B-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Linly-AI__Chinese-LLaMA-2-7B-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T14:35:23.324449](https://huggingface.co/datasets/open-llm-leaderboard/details_Linly-AI__Chinese-LLaMA-2-7B-hf/blob/main/results_2023-10-01T14-35-23.324449.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3552072882482051,\n\
\ \"acc_stderr\": 0.0341565610685841,\n \"acc_norm\": 0.3590741042531357,\n\
\ \"acc_norm_stderr\": 0.0341485904255498,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.39920341539069243,\n\
\ \"mc2_stderr\": 0.013913947924718102\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.44283276450511944,\n \"acc_stderr\": 0.014515573873348895,\n\
\ \"acc_norm\": 0.4803754266211604,\n \"acc_norm_stderr\": 0.014600132075947085\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5419239195379406,\n\
\ \"acc_stderr\": 0.004972210244020564,\n \"acc_norm\": 0.7325234017128062,\n\
\ \"acc_norm_stderr\": 0.00441738410239868\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.33584905660377357,\n \"acc_stderr\": 0.029067220146644823,\n\
\ \"acc_norm\": 0.33584905660377357,\n \"acc_norm_stderr\": 0.029067220146644823\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3263888888888889,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.3263888888888889,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n\
\ \"acc_stderr\": 0.033917503223216586,\n \"acc_norm\": 0.27167630057803466,\n\
\ \"acc_norm_stderr\": 0.033917503223216586\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.03013590647851756,\n\
\ \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.03013590647851756\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003337,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003337\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2328042328042328,\n \"acc_stderr\": 0.02176596167215452,\n \"\
acc_norm\": 0.2328042328042328,\n \"acc_norm_stderr\": 0.02176596167215452\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.037649508797906066,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.037649508797906066\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.36774193548387096,\n\
\ \"acc_stderr\": 0.02743086657997347,\n \"acc_norm\": 0.36774193548387096,\n\
\ \"acc_norm_stderr\": 0.02743086657997347\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.40606060606060607,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.40606060606060607,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3838383838383838,\n \"acc_stderr\": 0.03464881675016338,\n \"\
acc_norm\": 0.3838383838383838,\n \"acc_norm_stderr\": 0.03464881675016338\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.49222797927461137,\n \"acc_stderr\": 0.03608003225569653,\n\
\ \"acc_norm\": 0.49222797927461137,\n \"acc_norm_stderr\": 0.03608003225569653\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3128205128205128,\n \"acc_stderr\": 0.023507579020645337,\n\
\ \"acc_norm\": 0.3128205128205128,\n \"acc_norm_stderr\": 0.023507579020645337\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3025210084033613,\n \"acc_stderr\": 0.029837962388291922,\n\
\ \"acc_norm\": 0.3025210084033613,\n \"acc_norm_stderr\": 0.029837962388291922\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.42385321100917434,\n \"acc_stderr\": 0.02118726320908751,\n \"\
acc_norm\": 0.42385321100917434,\n \"acc_norm_stderr\": 0.02118726320908751\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.20833333333333334,\n \"acc_stderr\": 0.02769691071309394,\n \"\
acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.02769691071309394\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4264705882352941,\n \"acc_stderr\": 0.034711579079534254,\n \"\
acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.034711579079534254\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.45569620253164556,\n \"acc_stderr\": 0.032419206846933335,\n \
\ \"acc_norm\": 0.45569620253164556,\n \"acc_norm_stderr\": 0.032419206846933335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4170403587443946,\n\
\ \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.4170403587443946,\n\
\ \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3893129770992366,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.3893129770992366,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4214876033057851,\n \"acc_stderr\": 0.045077322787750944,\n \"\
acc_norm\": 0.4214876033057851,\n \"acc_norm_stderr\": 0.045077322787750944\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.38650306748466257,\n \"acc_stderr\": 0.038258255488486076,\n\
\ \"acc_norm\": 0.38650306748466257,\n \"acc_norm_stderr\": 0.038258255488486076\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3592233009708738,\n \"acc_stderr\": 0.047504583990416946,\n\
\ \"acc_norm\": 0.3592233009708738,\n \"acc_norm_stderr\": 0.047504583990416946\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5256410256410257,\n\
\ \"acc_stderr\": 0.03271298896811159,\n \"acc_norm\": 0.5256410256410257,\n\
\ \"acc_norm_stderr\": 0.03271298896811159\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5044699872286079,\n\
\ \"acc_stderr\": 0.01787924897058436,\n \"acc_norm\": 0.5044699872286079,\n\
\ \"acc_norm_stderr\": 0.01787924897058436\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.026362437574546538,\n\
\ \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.026362437574546538\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3758169934640523,\n \"acc_stderr\": 0.027732834353363947,\n\
\ \"acc_norm\": 0.3758169934640523,\n \"acc_norm_stderr\": 0.027732834353363947\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40836012861736337,\n\
\ \"acc_stderr\": 0.027917050748484627,\n \"acc_norm\": 0.40836012861736337,\n\
\ \"acc_norm_stderr\": 0.027917050748484627\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4012345679012346,\n \"acc_stderr\": 0.027272582849839792,\n\
\ \"acc_norm\": 0.4012345679012346,\n \"acc_norm_stderr\": 0.027272582849839792\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.026891709428343957,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.026891709428343957\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2920469361147327,\n\
\ \"acc_stderr\": 0.01161334913627182,\n \"acc_norm\": 0.2920469361147327,\n\
\ \"acc_norm_stderr\": 0.01161334913627182\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.24632352941176472,\n \"acc_stderr\": 0.02617343857052,\n\
\ \"acc_norm\": 0.24632352941176472,\n \"acc_norm_stderr\": 0.02617343857052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.36437908496732024,\n \"acc_stderr\": 0.01946951822157369,\n \
\ \"acc_norm\": 0.36437908496732024,\n \"acc_norm_stderr\": 0.01946951822157369\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.39090909090909093,\n\
\ \"acc_stderr\": 0.04673752333670237,\n \"acc_norm\": 0.39090909090909093,\n\
\ \"acc_norm_stderr\": 0.04673752333670237\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3183673469387755,\n \"acc_stderr\": 0.029822533793982062,\n\
\ \"acc_norm\": 0.3183673469387755,\n \"acc_norm_stderr\": 0.029822533793982062\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4527363184079602,\n\
\ \"acc_stderr\": 0.03519702717576915,\n \"acc_norm\": 0.4527363184079602,\n\
\ \"acc_norm_stderr\": 0.03519702717576915\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.0357160923005348,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.0357160923005348\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.03829509868994727,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.03829509868994727\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.39920341539069243,\n\
\ \"mc2_stderr\": 0.013913947924718102\n }\n}\n```"
repo_url: https://huggingface.co/Linly-AI/Chinese-LLaMA-2-7B-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-35-23.324449.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-35-23.324449.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-35-23.324449.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-35-23.324449.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_35_23.324449
path:
- results_2023-10-01T14-35-23.324449.parquet
- split: latest
path:
- results_2023-10-01T14-35-23.324449.parquet
---
# Dataset Card for Evaluation run of Linly-AI/Chinese-LLaMA-2-7B-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Linly-AI/Chinese-LLaMA-2-7B-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Linly-AI/Chinese-LLaMA-2-7B-hf](https://huggingface.co/Linly-AI/Chinese-LLaMA-2-7B-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Linly-AI__Chinese-LLaMA-2-7B-hf",
"harness_truthfulqa_mc_0",
split="train")
```
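The same call works for the aggregated metrics: the `results` configuration and its `latest` split are both declared in the config list above. A minimal sketch:
```python
from datasets import load_dataset

# Load only the aggregated metrics for the most recent run; both the
# "results" config and the "latest" split are defined in this card's configs.
results = load_dataset(
    "open-llm-leaderboard/details_Linly-AI__Chinese-LLaMA-2-7B-hf",
    "results",
    split="latest",
)
```
Swapping the config name for any `harness_*` entry above retrieves the per-sample details of that task instead.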
## Latest results
These are the [latest results from run 2023-10-01T14:35:23.324449](https://huggingface.co/datasets/open-llm-leaderboard/details_Linly-AI__Chinese-LLaMA-2-7B-hf/blob/main/results_2023-10-01T14-35-23.324449.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3552072882482051,
"acc_stderr": 0.0341565610685841,
"acc_norm": 0.3590741042531357,
"acc_norm_stderr": 0.0341485904255498,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.39920341539069243,
"mc2_stderr": 0.013913947924718102
},
"harness|arc:challenge|25": {
"acc": 0.44283276450511944,
"acc_stderr": 0.014515573873348895,
"acc_norm": 0.4803754266211604,
"acc_norm_stderr": 0.014600132075947085
},
"harness|hellaswag|10": {
"acc": 0.5419239195379406,
"acc_stderr": 0.004972210244020564,
"acc_norm": 0.7325234017128062,
"acc_norm_stderr": 0.00441738410239868
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33584905660377357,
"acc_stderr": 0.029067220146644823,
"acc_norm": 0.33584905660377357,
"acc_norm_stderr": 0.029067220146644823
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3263888888888889,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.3263888888888889,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.033917503223216586,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.033917503223216586
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.03013590647851756,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.03013590647851756
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003337,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003337
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2328042328042328,
"acc_stderr": 0.02176596167215452,
"acc_norm": 0.2328042328042328,
"acc_norm_stderr": 0.02176596167215452
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.037649508797906066,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.037649508797906066
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36774193548387096,
"acc_stderr": 0.02743086657997347,
"acc_norm": 0.36774193548387096,
"acc_norm_stderr": 0.02743086657997347
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.40606060606060607,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.40606060606060607,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3838383838383838,
"acc_stderr": 0.03464881675016338,
"acc_norm": 0.3838383838383838,
"acc_norm_stderr": 0.03464881675016338
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.49222797927461137,
"acc_stderr": 0.03608003225569653,
"acc_norm": 0.49222797927461137,
"acc_norm_stderr": 0.03608003225569653
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3128205128205128,
"acc_stderr": 0.023507579020645337,
"acc_norm": 0.3128205128205128,
"acc_norm_stderr": 0.023507579020645337
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3025210084033613,
"acc_stderr": 0.029837962388291922,
"acc_norm": 0.3025210084033613,
"acc_norm_stderr": 0.029837962388291922
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.42385321100917434,
"acc_stderr": 0.02118726320908751,
"acc_norm": 0.42385321100917434,
"acc_norm_stderr": 0.02118726320908751
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.02769691071309394,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.02769691071309394
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.034711579079534254,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.034711579079534254
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.45569620253164556,
"acc_stderr": 0.032419206846933335,
"acc_norm": 0.45569620253164556,
"acc_norm_stderr": 0.032419206846933335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4170403587443946,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.4170403587443946,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3893129770992366,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.3893129770992366,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4214876033057851,
"acc_stderr": 0.045077322787750944,
"acc_norm": 0.4214876033057851,
"acc_norm_stderr": 0.045077322787750944
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.38650306748466257,
"acc_stderr": 0.038258255488486076,
"acc_norm": 0.38650306748466257,
"acc_norm_stderr": 0.038258255488486076
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.3592233009708738,
"acc_stderr": 0.047504583990416946,
"acc_norm": 0.3592233009708738,
"acc_norm_stderr": 0.047504583990416946
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5256410256410257,
"acc_stderr": 0.03271298896811159,
"acc_norm": 0.5256410256410257,
"acc_norm_stderr": 0.03271298896811159
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5044699872286079,
"acc_stderr": 0.01787924897058436,
"acc_norm": 0.5044699872286079,
"acc_norm_stderr": 0.01787924897058436
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.026362437574546538,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.026362437574546538
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3758169934640523,
"acc_stderr": 0.027732834353363947,
"acc_norm": 0.3758169934640523,
"acc_norm_stderr": 0.027732834353363947
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40836012861736337,
"acc_stderr": 0.027917050748484627,
"acc_norm": 0.40836012861736337,
"acc_norm_stderr": 0.027917050748484627
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4012345679012346,
"acc_stderr": 0.027272582849839792,
"acc_norm": 0.4012345679012346,
"acc_norm_stderr": 0.027272582849839792
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.026891709428343957,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.026891709428343957
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2920469361147327,
"acc_stderr": 0.01161334913627182,
"acc_norm": 0.2920469361147327,
"acc_norm_stderr": 0.01161334913627182
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.24632352941176472,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.24632352941176472,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.36437908496732024,
"acc_stderr": 0.01946951822157369,
"acc_norm": 0.36437908496732024,
"acc_norm_stderr": 0.01946951822157369
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.39090909090909093,
"acc_stderr": 0.04673752333670237,
"acc_norm": 0.39090909090909093,
"acc_norm_stderr": 0.04673752333670237
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3183673469387755,
"acc_stderr": 0.029822533793982062,
"acc_norm": 0.3183673469387755,
"acc_norm_stderr": 0.029822533793982062
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4527363184079602,
"acc_stderr": 0.03519702717576915,
"acc_norm": 0.4527363184079602,
"acc_norm_stderr": 0.03519702717576915
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.0357160923005348,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.0357160923005348
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.03829509868994727,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.03829509868994727
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.39920341539069243,
"mc2_stderr": 0.013913947924718102
}
}
```
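If you would rather work with the raw results file linked above than the `datasets` API, a sketch along the following lines should work; it assumes the `huggingface_hub` client is installed and that the file's top-level layout matches the dict printed above (adjust the lookup if the metrics are nested under another key):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON from this dataset repo (repo_type="dataset"
# is required, since this is not a model repo).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Linly-AI__Chinese-LLaMA-2-7B-hf",
    filename="results_2023-10-01T14-35-23.324449.json",
    repo_type="dataset",
)

with open(path) as f:
    raw = json.load(f)

# Assumption: the aggregate metrics sit under "all" at the top level, as in
# the dict shown above; fall back to printing everything otherwise.
print(raw.get("all", raw))
```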
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down | 2023-10-01T14:37:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T14:35:45.846264](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down/blob/main/results_2023-10-01T14-35-45.846264.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.540748084204488,\n\
\ \"acc_stderr\": 0.0346563838454107,\n \"acc_norm\": 0.5449223150311865,\n\
\ \"acc_norm_stderr\": 0.034637127769711853,\n \"mc1\": 0.2692778457772338,\n\
\ \"mc1_stderr\": 0.015528566637087291,\n \"mc2\": 0.39660004998059784,\n\
\ \"mc2_stderr\": 0.014022253129284996\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5332764505119454,\n \"acc_stderr\": 0.01457899585960581,\n\
\ \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.014456862944650649\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6108344951204939,\n\
\ \"acc_stderr\": 0.004865645485910422,\n \"acc_norm\": 0.8178649671380203,\n\
\ \"acc_norm_stderr\": 0.0038516699346338884\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955784,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955784\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.0242785680243077,\n \"acc_norm\"\
: 0.3333333333333333,\n \"acc_norm_stderr\": 0.0242785680243077\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\
\ \"acc_stderr\": 0.027273890594300642,\n \"acc_norm\": 0.6419354838709678,\n\
\ \"acc_norm_stderr\": 0.027273890594300642\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438803,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438803\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845436,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48205128205128206,\n \"acc_stderr\": 0.025334667080954935,\n\
\ \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.025334667080954935\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131137,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131137\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.032284106267163895,\n\
\ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.032284106267163895\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7211009174311926,\n\
\ \"acc_stderr\": 0.01922746887646351,\n \"acc_norm\": 0.7211009174311926,\n\
\ \"acc_norm_stderr\": 0.01922746887646351\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696041,\n\
\ \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696041\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.0284588209914603,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.0284588209914603\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n\
\ \"acc_stderr\": 0.027421007295392912,\n \"acc_norm\": 0.7735042735042735,\n\
\ \"acc_norm_stderr\": 0.027421007295392912\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395953,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395953\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124655,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124655\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3329608938547486,\n\
\ \"acc_stderr\": 0.015761716178397556,\n \"acc_norm\": 0.3329608938547486,\n\
\ \"acc_norm_stderr\": 0.015761716178397556\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5718954248366013,\n \"acc_stderr\": 0.028332397483664278,\n\
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.028332397483664278\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.027264297599804015,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.027264297599804015\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.02677492989972232,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.02677492989972232\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41003911342894395,\n\
\ \"acc_stderr\": 0.01256183762196204,\n \"acc_norm\": 0.41003911342894395,\n\
\ \"acc_norm_stderr\": 0.01256183762196204\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455502,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455502\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5620915032679739,\n \"acc_stderr\": 0.020071257886886528,\n \
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.020071257886886528\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670237,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670237\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n\
\ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.681592039800995,\n\
\ \"acc_stderr\": 0.032941184790540944,\n \"acc_norm\": 0.681592039800995,\n\
\ \"acc_norm_stderr\": 0.032941184790540944\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2692778457772338,\n\
\ \"mc1_stderr\": 0.015528566637087291,\n \"mc2\": 0.39660004998059784,\n\
\ \"mc2_stderr\": 0.014022253129284996\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-35-45.846264.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-35-45.846264.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-35-45.846264.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-35-45.846264.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_35_45.846264
path:
- results_2023-10-01T14-35-45.846264.parquet
- split: latest
path:
- results_2023-10-01T14-35-45.846264.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
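Since the aggregated numbers live in their own configuration, you can load them directly. A minimal sketch (assuming the `datasets` library is installed; the `results` configuration and `latest` split are the ones declared in this card's YAML header):
```python
from datasets import load_dataset

# Aggregated per-task metrics for the most recent run of this model.
agg = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down",
    "results",
    split="latest",
)
print(agg)
```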
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down",
"harness_truthfulqa_mc_0",
split="train")
```
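If you are unsure which of the 61 task configurations or which timestamped splits exist, the inspection helpers in `datasets` can enumerate them. A small sketch (helper names as in recent `datasets` releases):
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down"

# Every configuration in this details repository (one per evaluated task, plus "results").
configs = get_dataset_config_names(repo)
print(len(configs), "configs; first three:", configs[:3])

# Each configuration exposes a timestamped split plus the "latest" alias.
print(get_dataset_split_names(repo, "harness_truthfulqa_mc_0"))
```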
## Latest results
These are the [latest results from run 2023-10-01T14:35:45.846264](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r8-gate_up_down/blob/main/results_2023-10-01T14-35-45.846264.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.540748084204488,
"acc_stderr": 0.0346563838454107,
"acc_norm": 0.5449223150311865,
"acc_norm_stderr": 0.034637127769711853,
"mc1": 0.2692778457772338,
"mc1_stderr": 0.015528566637087291,
"mc2": 0.39660004998059784,
"mc2_stderr": 0.014022253129284996
},
"harness|arc:challenge|25": {
"acc": 0.5332764505119454,
"acc_stderr": 0.01457899585960581,
"acc_norm": 0.5725255972696246,
"acc_norm_stderr": 0.014456862944650649
},
"harness|hellaswag|10": {
"acc": 0.6108344951204939,
"acc_stderr": 0.004865645485910422,
"acc_norm": 0.8178649671380203,
"acc_norm_stderr": 0.0038516699346338884
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955784,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0242785680243077,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0242785680243077
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300642,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300642
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438803,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438803
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845436,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48205128205128206,
"acc_stderr": 0.025334667080954935,
"acc_norm": 0.48205128205128206,
"acc_norm_stderr": 0.025334667080954935
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131137,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131137
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.032284106267163895,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.032284106267163895
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7211009174311926,
"acc_stderr": 0.01922746887646351,
"acc_norm": 0.7211009174311926,
"acc_norm_stderr": 0.01922746887646351
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696041,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696041
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.0284588209914603,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.0284588209914603
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7735042735042735,
"acc_stderr": 0.027421007295392912,
"acc_norm": 0.7735042735042735,
"acc_norm_stderr": 0.027421007295392912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395953,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395953
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3329608938547486,
"acc_stderr": 0.015761716178397556,
"acc_norm": 0.3329608938547486,
"acc_norm_stderr": 0.015761716178397556
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.028332397483664278,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.028332397483664278
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804015,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804015
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.02677492989972232,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.02677492989972232
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41003911342894395,
"acc_stderr": 0.01256183762196204,
"acc_norm": 0.41003911342894395,
"acc_norm_stderr": 0.01256183762196204
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455502,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455502
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.020071257886886528,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.020071257886886528
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670237,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670237
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.03198761546763127,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.03198761546763127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.681592039800995,
"acc_stderr": 0.032941184790540944,
"acc_norm": 0.681592039800995,
"acc_norm_stderr": 0.032941184790540944
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2692778457772338,
"mc1_stderr": 0.015528566637087291,
"mc2": 0.39660004998059784,
"mc2_stderr": 0.014022253129284996
}
}
```
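To make the blob above easier to compare across tasks, here is a minimal sketch that ranks the MMLU (hendrycksTest) subtasks by normalized accuracy. It needs Python 3.9+ for `str.removeprefix`, and the `results` name is assumed to be bound to the dictionary shown above, however you obtained it:
```python
# `results` is assumed to hold the dictionary shown above.
mmlu = {
    name.removeprefix("harness|hendrycksTest-").removesuffix("|5"): scores["acc_norm"]
    for name, scores in results.items()
    if name.startswith("harness|hendrycksTest-")
}

# Print the five strongest and five weakest subtasks for this checkpoint.
ranked = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
for task, acc_norm in ranked[:5] + ranked[-5:]:
    print(f"{task:45s} {acc_norm:.3f}")
```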
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o | 2023-10-01T14:41:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T14:39:43.645345](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o/blob/main/results_2023-10-01T14-39-43.645345.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each task's results in its own configuration,\
\ under the \"latest\" split):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.5563948827229734,\n\
\ \"acc_stderr\": 0.03434363879336393,\n \"acc_norm\": 0.5607780987147378,\n\
\ \"acc_norm_stderr\": 0.03432358169297833,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.3812865521559485,\n\
\ \"mc2_stderr\": 0.01383203045686917\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5435153583617748,\n \"acc_stderr\": 0.014555949760496442,\n\
\ \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.014356399418009124\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.602867954590719,\n\
\ \"acc_stderr\": 0.004883037758919966,\n \"acc_norm\": 0.8119896434973113,\n\
\ \"acc_norm_stderr\": 0.0038992191786572307\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009798,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009798\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670788,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670788\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.03812400565974833,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.03812400565974833\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374768,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374768\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.041307408795554966,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.041307408795554966\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.024326310529149138,\n \"\
acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.024326310529149138\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860677,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860677\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.49230769230769234,\n \"acc_stderr\": 0.025348006031534778,\n\
\ \"acc_norm\": 0.49230769230769234,\n \"acc_norm_stderr\": 0.025348006031534778\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606647,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606647\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7522935779816514,\n\
\ \"acc_stderr\": 0.018508143602547825,\n \"acc_norm\": 0.7522935779816514,\n\
\ \"acc_norm_stderr\": 0.018508143602547825\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n\
\ \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460295,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460295\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262971,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262971\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6319018404907976,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.6319018404907976,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543678,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543678\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7713920817369093,\n\
\ \"acc_stderr\": 0.01501688469853988,\n \"acc_norm\": 0.7713920817369093,\n\
\ \"acc_norm_stderr\": 0.01501688469853988\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030457,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030457\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n\
\ \"acc_stderr\": 0.015551673652172542,\n \"acc_norm\": 0.31620111731843575,\n\
\ \"acc_norm_stderr\": 0.015551673652172542\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.02807415894760065,\n\
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.02807415894760065\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.0266756119260371,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.0266756119260371\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4132985658409387,\n\
\ \"acc_stderr\": 0.012576779494860087,\n \"acc_norm\": 0.4132985658409387,\n\
\ \"acc_norm_stderr\": 0.012576779494860087\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.030161911930767102,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.030161911930767102\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5751633986928104,\n \"acc_stderr\": 0.01999797303545833,\n \
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.01999797303545833\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235933,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235933\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.030769444967296024,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.030769444967296024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.3812865521559485,\n\
\ \"mc2_stderr\": 0.01383203045686917\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-39-43.645345.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-39-43.645345.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-39-43.645345.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-39-43.645345.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_39_43.645345
path:
- results_2023-10-01T14-39-43.645345.parquet
- split: latest
path:
- results_2023-10-01T14-39-43.645345.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o",
"harness_truthfulqa_mc_0",
split="train")
```
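For example, to load the aggregated metrics or a single task pinned to one specific run (the configuration and split names below are taken verbatim from this card's `configs` list), a sketch along these lines should work:
```python
from datasets import load_dataset

# Aggregated metrics for the whole run; the "latest" split always
# points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o",
    "results",
    split="latest",
)

# One task, pinned to a specific run via its timestamped split name.
arc = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o",
    "harness_arc_challenge_25",
    split="2023_10_01T14_39_43.645345",
)
```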
## Latest results
These are the [latest results from run 2023-10-01T14:39:43.645345](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-q_k_v_o/blob/main/results_2023-10-01T14-39-43.645345.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5563948827229734,
"acc_stderr": 0.03434363879336393,
"acc_norm": 0.5607780987147378,
"acc_norm_stderr": 0.03432358169297833,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.3812865521559485,
"mc2_stderr": 0.01383203045686917
},
"harness|arc:challenge|25": {
"acc": 0.5435153583617748,
"acc_stderr": 0.014555949760496442,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.014356399418009124
},
"harness|hellaswag|10": {
"acc": 0.602867954590719,
"acc_stderr": 0.004883037758919966,
"acc_norm": 0.8119896434973113,
"acc_norm_stderr": 0.0038992191786572307
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009798,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009798
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670788,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670788
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.03812400565974833,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.03812400565974833
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374768,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374768
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.024326310529149138,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.024326310529149138
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860677,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860677
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49230769230769234,
"acc_stderr": 0.025348006031534778,
"acc_norm": 0.49230769230769234,
"acc_norm_stderr": 0.025348006031534778
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606647,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606647
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.018508143602547825,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.018508143602547825
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460295,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460295
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262971,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262971
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6319018404907976,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.6319018404907976,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543678,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543678
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7713920817369093,
"acc_stderr": 0.01501688469853988,
"acc_norm": 0.7713920817369093,
"acc_norm_stderr": 0.01501688469853988
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030457,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030457
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31620111731843575,
"acc_stderr": 0.015551673652172542,
"acc_norm": 0.31620111731843575,
"acc_norm_stderr": 0.015551673652172542
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.02807415894760065,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.02807415894760065
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.0266756119260371,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.0266756119260371
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4132985658409387,
"acc_stderr": 0.012576779494860087,
"acc_norm": 0.4132985658409387,
"acc_norm_stderr": 0.012576779494860087
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.01999797303545833,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.01999797303545833
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235933,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235933
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296024,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.3812865521559485,
"mc2_stderr": 0.01383203045686917
}
}
```
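As a quick sanity check, the 57 `hendrycksTest` (MMLU) sub-task scores above can be averaged back into a single number. A minimal sketch, assuming the JSON block above has been saved locally as `results.json` (a hypothetical path):
```python
import json

# Hypothetical local copy of the JSON results shown above.
with open("results.json") as f:
    results = json.load(f)

# Average acc_norm over all hendrycksTest (MMLU) sub-tasks.
mmlu = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
print(f"MMLU acc_norm over {len(mmlu)} tasks: {sum(mmlu) / len(mmlu):.4f}")
```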
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-gate_up_down | 2023-10-01T14:44:37.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-gate_up_down\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T14:43:15.199560](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-gate_up_down/blob/main/results_2023-10-01T14-43-15.199560.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the \"results\" and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5613301889355844,\n\
\ \"acc_stderr\": 0.03445292365989304,\n \"acc_norm\": 0.5656165075635413,\n\
\ \"acc_norm_stderr\": 0.03443270213125714,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237028,\n \"mc2\": 0.3895415687109435,\n\
\ \"mc2_stderr\": 0.014030033637085404\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5426621160409556,\n \"acc_stderr\": 0.014558106543924065,\n\
\ \"acc_norm\": 0.5870307167235495,\n \"acc_norm_stderr\": 0.014388344935398326\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.610336586337383,\n\
\ \"acc_stderr\": 0.004866772373029928,\n \"acc_norm\": 0.8188607847042422,\n\
\ \"acc_norm_stderr\": 0.0038434637920379145\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.030052580579557845,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.030052580579557845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.038118909889404126,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.038118909889404126\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400352,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400352\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.02487081525105709,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02487081525105709\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n\
\ \"acc_stderr\": 0.027218889773308767,\n \"acc_norm\": 0.6451612903225806,\n\
\ \"acc_norm_stderr\": 0.027218889773308767\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538095,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538095\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713547,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713547\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.030031147977641538,\n\
\ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.030031147977641538\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n\
\ \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"\
acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.043389203057924,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.043389203057924\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922737,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922737\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\
\ \"acc_stderr\": 0.015246803197398687,\n \"acc_norm\": 0.7611749680715197,\n\
\ \"acc_norm_stderr\": 0.015246803197398687\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.025522474632121615,\n\
\ \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.025522474632121615\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n\
\ \"acc_stderr\": 0.015961036675230973,\n \"acc_norm\": 0.35083798882681566,\n\
\ \"acc_norm_stderr\": 0.015961036675230973\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.027264297599804015,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.027264297599804015\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.02662415247884585,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.02662415247884585\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n\
\ \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.4211212516297262,\n\
\ \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02989616303312547,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02989616303312547\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5702614379084967,\n \"acc_stderr\": 0.02002712278492854,\n \
\ \"acc_norm\": 0.5702614379084967,\n \"acc_norm_stderr\": 0.02002712278492854\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5755102040816327,\n \"acc_stderr\": 0.03164209487942942,\n\
\ \"acc_norm\": 0.5755102040816327,\n \"acc_norm_stderr\": 0.03164209487942942\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237028,\n \"mc2\": 0.3895415687109435,\n\
\ \"mc2_stderr\": 0.014030033637085404\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-43-15.199560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-43-15.199560.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-43-15.199560.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-43-15.199560.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_43_15.199560
path:
- results_2023-10-01T14-43-15.199560.parquet
- split: latest
path:
- results_2023-10-01T14-43-15.199560.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r16-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset

# Per-sample details for one task; the "latest" split always points to
# the most recent run (the configs above define no "train" split).
data = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-gate_up_down",
    "harness_truthfulqa_mc_0",
    split="latest",
)
```
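The aggregated metrics live in the `results` configuration listed above; a minimal sketch of reading them, assuming the `results` config exposes the same "latest" split as the per-task configs (as the configs list above defines):
```python
from datasets import load_dataset

# Minimal sketch: the "results" config stores the aggregated metrics of
# each run; "latest" points at the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-gate_up_down",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the latest run
```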
## Latest results
These are the [latest results from run 2023-10-01T14:43:15.199560](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r16-gate_up_down/blob/main/results_2023-10-01T14-43-15.199560.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.5613301889355844,
"acc_stderr": 0.03445292365989304,
"acc_norm": 0.5656165075635413,
"acc_norm_stderr": 0.03443270213125714,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237028,
"mc2": 0.3895415687109435,
"mc2_stderr": 0.014030033637085404
},
"harness|arc:challenge|25": {
"acc": 0.5426621160409556,
"acc_stderr": 0.014558106543924065,
"acc_norm": 0.5870307167235495,
"acc_norm_stderr": 0.014388344935398326
},
"harness|hellaswag|10": {
"acc": 0.610336586337383,
"acc_stderr": 0.004866772373029928,
"acc_norm": 0.8188607847042422,
"acc_norm_stderr": 0.0038434637920379145
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.030052580579557845,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.030052580579557845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.038118909889404126,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.038118909889404126
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400352,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400352
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02487081525105709,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02487081525105709
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.027218889773308767,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.027218889773308767
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.036462049632538095,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.036462049632538095
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.030031147977641538,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.030031147977641538
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7724770642201835,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.7724770642201835,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.043389203057924,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.043389203057924
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922737,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922737
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398687,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398687
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.025522474632121615,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.025522474632121615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.015961036675230973,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.015961036675230973
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804015,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804015
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.02662415247884585,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.02662415247884585
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02989616303312547,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02989616303312547
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5702614379084967,
"acc_stderr": 0.02002712278492854,
"acc_norm": 0.5702614379084967,
"acc_norm_stderr": 0.02002712278492854
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5755102040816327,
"acc_stderr": 0.03164209487942942,
"acc_norm": 0.5755102040816327,
"acc_norm_stderr": 0.03164209487942942
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237028,
"mc2": 0.3895415687109435,
"mc2_stderr": 0.014030033637085404
}
}
```
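For a quick sanity check against the aggregates reported under "all", the per-task numbers can be re-averaged directly from this JSON; a small sketch, assuming the block above has been saved locally as `results.json` (a hypothetical file name):
```python
import json

# Recompute the mean accuracy over the hendrycksTest (MMLU) tasks from a
# local copy of the results JSON shown above.
with open("results.json") as f:  # hypothetical local copy
    results = json.load(f)

mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU tasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```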
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
taufiqtad/your-dataset-name | 2023-10-01T14:44:26.000Z | [
"region:us"
] | taufiqtad | null | null | null | 0 | 0 | Entry not found |
taufiqtad/taufiqtest | 2023-10-01T14:49:53.000Z | [
"region:us"
] | taufiqtad | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o | 2023-10-01T14:47:45.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T14:46:22.924681](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o/blob/main/results_2023-10-01T14-46-22.924681.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```json\n{\n    \"all\": {\n        \"acc\": 0.5478354761212929,\n\
\ \"acc_stderr\": 0.0346740589555204,\n \"acc_norm\": 0.5518226635259099,\n\
\ \"acc_norm_stderr\": 0.034656332661251746,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.4102046930159844,\n\
\ \"mc2_stderr\": 0.014107899732834902\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5187713310580204,\n \"acc_stderr\": 0.014601090150633962,\n\
\ \"acc_norm\": 0.5477815699658704,\n \"acc_norm_stderr\": 0.014544519880633822\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6077474606652061,\n\
\ \"acc_stderr\": 0.004872546302641849,\n \"acc_norm\": 0.813981278629755,\n\
\ \"acc_norm_stderr\": 0.0038832652107917047\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791194,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791194\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.49710982658959535,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.02459497512892093,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02459497512892093\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.027104826328100944,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.027104826328100944\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860677,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860677\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230182,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230182\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501624,\n \"\
acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395593,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395593\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.041733491480835,\n \"acc_norm\"\
: 0.7024793388429752,\n \"acc_norm_stderr\": 0.041733491480835\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6319018404907976,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.6319018404907976,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922744,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922744\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7458492975734355,\n\
\ \"acc_stderr\": 0.015569254692045755,\n \"acc_norm\": 0.7458492975734355,\n\
\ \"acc_norm_stderr\": 0.015569254692045755\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.026329813341946243,\n\
\ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.026329813341946243\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n\
\ \"acc_stderr\": 0.014912413096372434,\n \"acc_norm\": 0.2737430167597765,\n\
\ \"acc_norm_stderr\": 0.014912413096372434\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.028431095444176636,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.028431095444176636\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\
\ \"acc_stderr\": 0.027513925683549434,\n \"acc_norm\": 0.6237942122186495,\n\
\ \"acc_norm_stderr\": 0.027513925683549434\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.02686949074481526,\n\
\ \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.02686949074481526\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4172099087353325,\n\
\ \"acc_stderr\": 0.012593959992906419,\n \"acc_norm\": 0.4172099087353325,\n\
\ \"acc_norm_stderr\": 0.012593959992906419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5637254901960784,\n \"acc_stderr\": 0.02006287424353913,\n \
\ \"acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.02006287424353913\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5346938775510204,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.5346938775510204,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.4102046930159844,\n\
\ \"mc2_stderr\": 0.014107899732834902\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-46-22.924681.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-46-22.924681.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-46-22.924681.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-46-22.924681.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_46_22.924681
path:
- results_2023-10-01T14-46-22.924681.parquet
- split: latest
path:
- results_2023-10-01T14-46-22.924681.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o",
"harness_truthfulqa_mc_0",
split="train")
```
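The configs listed in this card's header also expose a timestamped split and a "latest" split for every task, plus a "results" config holding the aggregated scores. As a minimal sketch (the config and split names below are taken verbatim from the YAML header of this card), you could load those directly:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o"

# Load one task at the specific run timestamp instead of the latest results.
virology = load_dataset(
    REPO,
    "harness_hendrycksTest_virology_5",
    split="2023_10_01T14_46_22.924681",  # timestamped split name from the configs above
)

# Load the aggregated scores used to compute the leaderboard metrics.
results = load_dataset(REPO, "results", split="latest")
```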
## Latest results
These are the [latest results from run 2023-10-01T14:46:22.924681](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o/blob/main/results_2023-10-01T14-46-22.924681.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5478354761212929,
"acc_stderr": 0.0346740589555204,
"acc_norm": 0.5518226635259099,
"acc_norm_stderr": 0.034656332661251746,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.4102046930159844,
"mc2_stderr": 0.014107899732834902
},
"harness|arc:challenge|25": {
"acc": 0.5187713310580204,
"acc_stderr": 0.014601090150633962,
"acc_norm": 0.5477815699658704,
"acc_norm_stderr": 0.014544519880633822
},
"harness|hellaswag|10": {
"acc": 0.6077474606652061,
"acc_stderr": 0.004872546302641849,
"acc_norm": 0.813981278629755,
"acc_norm_stderr": 0.0038832652107917047
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791194,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791194
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111502,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111502
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02459497512892093,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02459497512892093
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.027104826328100944,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.027104826328100944
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860677,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860677
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230182,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230182
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7504587155963303,
"acc_stderr": 0.018553897629501624,
"acc_norm": 0.7504587155963303,
"acc_norm_stderr": 0.018553897629501624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395593,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395593
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416828,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416828
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.041733491480835,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.041733491480835
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6319018404907976,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.6319018404907976,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922744,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922744
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7458492975734355,
"acc_stderr": 0.015569254692045755,
"acc_norm": 0.7458492975734355,
"acc_norm_stderr": 0.015569254692045755
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.026329813341946243,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.026329813341946243
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372434,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372434
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.028431095444176636,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.028431095444176636
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.027513925683549434,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.027513925683549434
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.02686949074481526,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.02686949074481526
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4172099087353325,
"acc_stderr": 0.012593959992906419,
"acc_norm": 0.4172099087353325,
"acc_norm_stderr": 0.012593959992906419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.02006287424353913,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.02006287424353913
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5346938775510204,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.5346938775510204,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.4102046930159844,
"mc2_stderr": 0.014107899732834902
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-gate_up_down | 2023-10-01T14:51:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-gate_up_down\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T14:49:36.997866](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-gate_up_down/blob/main/results_2023-10-01T14-49-36.997866.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5512568945460414,\n\
\ \"acc_stderr\": 0.03444971051410127,\n \"acc_norm\": 0.5555268331933795,\n\
\ \"acc_norm_stderr\": 0.03443091543990435,\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.015572840452875833,\n \"mc2\": 0.39116593699627455,\n\
\ \"mc2_stderr\": 0.013981572285112515\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5162116040955631,\n \"acc_stderr\": 0.014603708567414947,\n\
\ \"acc_norm\": 0.5580204778156996,\n \"acc_norm_stderr\": 0.014512682523128342\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6072495518820952,\n\
\ \"acc_stderr\": 0.0048736401847734425,\n \"acc_norm\": 0.8173670583549094,\n\
\ \"acc_norm_stderr\": 0.00385575685144154\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.030365050829115208,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.030365050829115208\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047736,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047736\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\
\ \"acc_stderr\": 0.027575960723278243,\n \"acc_norm\": 0.6225806451612903,\n\
\ \"acc_norm_stderr\": 0.027575960723278243\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4897435897435897,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.4897435897435897,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.02794045713622842,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.02794045713622842\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7522935779816514,\n \"acc_stderr\": 0.018508143602547832,\n \"\
acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.018508143602547832\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.033723432716530645,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.033723432716530645\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7156862745098039,\n \"acc_stderr\": 0.031660096793998116,\n \"\
acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.031660096793998116\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.01546467616339596,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.01546467616339596\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242832,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242832\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n\
\ \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n\
\ \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5718954248366013,\n \"acc_stderr\": 0.028332397483664278,\n\
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.028332397483664278\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.02709865262130175,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.02709865262130175\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722327,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722327\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.029525914302558555,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.029525914302558555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4282920469361147,\n\
\ \"acc_stderr\": 0.012638223880313172,\n \"acc_norm\": 0.4282920469361147,\n\
\ \"acc_norm_stderr\": 0.012638223880313172\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5686274509803921,\n \"acc_stderr\": 0.020036393768352638,\n \
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.020036393768352638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n\
\ \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014638,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014638\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533197,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533197\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.015572840452875833,\n \"mc2\": 0.39116593699627455,\n\
\ \"mc2_stderr\": 0.013981572285112515\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-49-36.997866.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-49-36.997866.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-49-36.997866.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-49-36.997866.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_49_36.997866
path:
- results_2023-10-01T14-49-36.997866.parquet
- split: latest
path:
- results_2023-10-01T14-49-36.997866.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-gate_up_down",
"harness_truthfulqa_mc_0",
split="train")
```
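The aggregated metrics live in the `results` configuration listed above. A minimal sketch for inspecting them (assuming the `latest` split defined in the configs of this repo):

```python
from datasets import load_dataset

# Load the single row of aggregated metrics from the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-gate_up_down",
    "results",
    split="latest",
)
print(results[0])
```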
## Latest results
These are the [latest results from run 2023-10-01T14:49:36.997866](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-gate_up_down/blob/main/results_2023-10-01T14-49-36.997866.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5512568945460414,
"acc_stderr": 0.03444971051410127,
"acc_norm": 0.5555268331933795,
"acc_norm_stderr": 0.03443091543990435,
"mc1": 0.2717258261933905,
"mc1_stderr": 0.015572840452875833,
"mc2": 0.39116593699627455,
"mc2_stderr": 0.013981572285112515
},
"harness|arc:challenge|25": {
"acc": 0.5162116040955631,
"acc_stderr": 0.014603708567414947,
"acc_norm": 0.5580204778156996,
"acc_norm_stderr": 0.014512682523128342
},
"harness|hellaswag|10": {
"acc": 0.6072495518820952,
"acc_stderr": 0.0048736401847734425,
"acc_norm": 0.8173670583549094,
"acc_norm_stderr": 0.00385575685144154
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.030365050829115208,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.030365050829115208
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111502,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111502
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047736,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047736
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278243,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278243
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4897435897435897,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.4897435897435897,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.02794045713622842,
"acc_norm": 0.3,
"acc_norm_stderr": 0.02794045713622842
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.018508143602547832,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.018508143602547832
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.033723432716530645,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.033723432716530645
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.031660096793998116,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.031660096793998116
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709697,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709697
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.01546467616339596,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.01546467616339596
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242832,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242832
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.028332397483664278,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.028332397483664278
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.02709865262130175,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.02709865262130175
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722327,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722327
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.029525914302558555,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.029525914302558555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4282920469361147,
"acc_stderr": 0.012638223880313172,
"acc_norm": 0.4282920469361147,
"acc_norm_stderr": 0.012638223880313172
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.020036393768352638,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.020036393768352638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.046737523336702384,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.046737523336702384
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5795918367346938,
"acc_stderr": 0.03160106993449601,
"acc_norm": 0.5795918367346938,
"acc_norm_stderr": 0.03160106993449601
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014638,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014638
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533197,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533197
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2717258261933905,
"mc1_stderr": 0.015572840452875833,
"mc2": 0.39116593699627455,
"mc2_stderr": 0.013981572285112515
}
}
```
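To rank tasks by score, the JSON above can be flattened into a small table. A sketch, assuming the block has been saved locally as `results.json` (an illustrative filename, not a file shipped with this dataset):

```python
import json

# Parse the results shown above and list the five highest-accuracy tasks.
with open("results.json") as f:
    results = json.load(f)

tasks = [
    (name, metrics["acc"])
    for name, metrics in results.items()
    # skip the "all" aggregate and "harness|truthfulqa:mc|0", which reports mc1/mc2
    if name != "all" and "acc" in metrics
]
for name, acc in sorted(tasks, key=lambda t: t[1], reverse=True)[:5]:
    print(f"{name:65s} {acc:.4f}")
```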
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
gfgfhgttr5/guanaco-llama2-1k | 2023-10-01T14:50:07.000Z | [
"region:us"
] | gfgfhgttr5 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nikchar/retrieval_verification_squeezebert_v2 | 2023-10-01T14:51:18.000Z | [
"region:us"
] | nikchar | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: claim
dtype: string
- name: evidence_wiki_url
dtype: string
- name: text
dtype: string
- name: retrieved_evidence_title
sequence: string
- name: retrieved_evidence_text
sequence: string
- name: labels
dtype: int64
- name: Retrieval_Success
dtype: bool
- name: Predicted_Labels
dtype: int64
- name: Predicted_Labels_Each_doc
sequence: int64
splits:
- name: train
num_bytes: 73601741
num_examples: 11073
download_size: 34426481
dataset_size: 73601741
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "retrieval_verification_squeezebert_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o | 2023-10-01T14:54:25.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 61 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
  \nThese are the [latest results from run 2023-10-01T14:53:03.540096](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o/blob/main/results_2023-10-01T14-53-03.540096.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5502243877604182,\n\
\ \"acc_stderr\": 0.034495145625879985,\n \"acc_norm\": 0.5545285437753931,\n\
\ \"acc_norm_stderr\": 0.03447532943540996,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.4130693548141457,\n\
\ \"mc2_stderr\": 0.014222336277496334\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5315699658703071,\n \"acc_stderr\": 0.014582236460866973,\n\
\ \"acc_norm\": 0.5767918088737202,\n \"acc_norm_stderr\": 0.014438036220848036\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.610336586337383,\n\
\ \"acc_stderr\": 0.004866772373029929,\n \"acc_norm\": 0.8190599482174865,\n\
\ \"acc_norm_stderr\": 0.0038418173753171935\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009798,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009798\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.038118909889404105,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.038118909889404105\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028417,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6258064516129033,\n \"acc_stderr\": 0.027528904299845704,\n \"\
acc_norm\": 0.6258064516129033,\n \"acc_norm_stderr\": 0.027528904299845704\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n \"\
acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.031911782267135466,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.031911782267135466\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.02534267129380725,\n \
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.02534267129380725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608466,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608466\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.032284106267163895,\n\
\ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.032284106267163895\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7559633027522936,\n \"acc_stderr\": 0.018415286351416406,\n \"\
acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.018415286351416406\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070415,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070415\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.03814269893261837,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.03814269893261837\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395962,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395962\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977254,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.016513676031179595,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.016513676031179595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.02807415894760065,\n\
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.02807415894760065\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.02695934451874778,\n\
\ \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.02695934451874778\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.029525914302558555,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.029525914302558555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4152542372881356,\n\
\ \"acc_stderr\": 0.012585471793400662,\n \"acc_norm\": 0.4152542372881356,\n\
\ \"acc_norm_stderr\": 0.012585471793400662\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.0302114796091216,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.0302114796091216\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324227,\n \
\ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324227\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.031987615467631264,\n\
\ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.031987615467631264\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.033076159479790326,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.033076159479790326\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.4130693548141457,\n\
\ \"mc2_stderr\": 0.014222336277496334\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-53-03.540096.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-53-03.540096.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-53-03.540096.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-53-03.540096.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_53_03.540096
path:
- results_2023-10-01T14-53-03.540096.parquet
- split: latest
path:
- results_2023-10-01T14-53-03.540096.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o",
"harness_truthfulqa_mc_0",
	split="latest")
```
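To pin a specific run instead of the latest one, pass the timestamped split name listed in the configs above; the aggregated metrics live in the "results" configuration. A minimal sketch, using only config and split names taken from this card's metadata:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o"

# Pin the exact run via its timestamped split name (see the configs above).
arc_run = load_dataset(repo, "harness_arc_challenge_25",
                       split="2023_10_01T14_53_03.540096")

# Aggregated metrics for the run live in the "results" configuration.
results = load_dataset(repo, "results", split="latest")
```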
## Latest results
These are the [latest results from run 2023-10-01T14:53:03.540096](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-q_k_v_o/blob/main/results_2023-10-01T14-53-03.540096.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5502243877604182,
"acc_stderr": 0.034495145625879985,
"acc_norm": 0.5545285437753931,
"acc_norm_stderr": 0.03447532943540996,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.4130693548141457,
"mc2_stderr": 0.014222336277496334
},
"harness|arc:challenge|25": {
"acc": 0.5315699658703071,
"acc_stderr": 0.014582236460866973,
"acc_norm": 0.5767918088737202,
"acc_norm_stderr": 0.014438036220848036
},
"harness|hellaswag|10": {
"acc": 0.610336586337383,
"acc_stderr": 0.004866772373029929,
"acc_norm": 0.8190599482174865,
"acc_norm_stderr": 0.0038418173753171935
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009798,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009798
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.038118909889404105,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.038118909889404105
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028417,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845704,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845704
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512567,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512567
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.02534267129380725,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.02534267129380725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608466,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608466
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.032284106267163895,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.032284106267163895
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.018415286351416406,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.018415286351416406
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070415,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070415
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.03814269893261837,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.03814269893261837
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395962,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395962
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977254,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.016513676031179595,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.016513676031179595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.02807415894760065,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.02807415894760065
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6234567901234568,
"acc_stderr": 0.02695934451874778,
"acc_norm": 0.6234567901234568,
"acc_norm_stderr": 0.02695934451874778
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.029525914302558555,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.029525914302558555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4152542372881356,
"acc_stderr": 0.012585471793400662,
"acc_norm": 0.4152542372881356,
"acc_norm_stderr": 0.012585471793400662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.0302114796091216,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.0302114796091216
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324227,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.031987615467631264,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.031987615467631264
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.033076159479790326,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.033076159479790326
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.4130693548141457,
"mc2_stderr": 0.014222336277496334
}
}
```
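The per-task dict above can be reduced to a single MMLU-style average as a quick sanity check. A minimal sketch, assuming the results JSON shown above has been downloaded locally; the raw file may nest this mapping under a "results" key, which the sketch handles:
```python
import json

with open("results_2023-10-01T14-53-03.540096.json") as f:
    raw = json.load(f)

# The raw file may nest the metrics under a "results" key.
metrics = raw.get("results", raw)

# Mean accuracy over the hendrycksTest (MMLU) subtasks printed above.
mmlu = [v["acc"] for k, v in metrics.items()
        if k.startswith("harness|hendrycksTest")]
print(f"MMLU mean acc over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```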
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-7b-Chat | 2023-10-01T14:56:45.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of FlagAlpha/Llama2-Chinese-7b-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FlagAlpha/Llama2-Chinese-7b-Chat](https://huggingface.co/FlagAlpha/Llama2-Chinese-7b-Chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-7b-Chat\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T14:55:21.985751](https://huggingface.co/datasets/open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-7b-Chat/blob/main/results_2023-10-01T14-55-21.985751.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4789900071561118,\n\
\ \"acc_stderr\": 0.03514490754147888,\n \"acc_norm\": 0.4830378814240372,\n\
\ \"acc_norm_stderr\": 0.03513182000347704,\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144905,\n \"mc2\": 0.4686795780185653,\n\
\ \"mc2_stderr\": 0.015229887658311217\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.48293515358361777,\n \"acc_stderr\": 0.014602878388536598,\n\
\ \"acc_norm\": 0.5238907849829352,\n \"acc_norm_stderr\": 0.014594701798071654\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5773750248954391,\n\
\ \"acc_stderr\": 0.004929672777184316,\n \"acc_norm\": 0.7752439753037244,\n\
\ \"acc_norm_stderr\": 0.004165684625540422\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.032081157507886836,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.032081157507886836\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.0236369759961018,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.0236369759961018\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5451612903225806,\n \"acc_stderr\": 0.028327743091561074,\n \"\
acc_norm\": 0.5451612903225806,\n \"acc_norm_stderr\": 0.028327743091561074\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"\
acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5858585858585859,\n \"acc_stderr\": 0.03509438348879629,\n \"\
acc_norm\": 0.5858585858585859,\n \"acc_norm_stderr\": 0.03509438348879629\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.032922966391551414,\n\
\ \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.032922966391551414\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4205128205128205,\n \"acc_stderr\": 0.025028610276710862,\n\
\ \"acc_norm\": 0.4205128205128205,\n \"acc_norm_stderr\": 0.025028610276710862\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871927,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871927\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n\
\ \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6385321100917432,\n \"acc_stderr\": 0.020598082009937374,\n \"\
acc_norm\": 0.6385321100917432,\n \"acc_norm_stderr\": 0.020598082009937374\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.03214952147802751,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03214952147802751\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6568627450980392,\n \"acc_stderr\": 0.033321399446680854,\n \"\
acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.033321399446680854\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6624472573839663,\n \"acc_stderr\": 0.030781549102026223,\n \
\ \"acc_norm\": 0.6624472573839663,\n \"acc_norm_stderr\": 0.030781549102026223\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n\
\ \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.5605381165919282,\n\
\ \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5038167938931297,\n \"acc_stderr\": 0.04385162325601553,\n\
\ \"acc_norm\": 0.5038167938931297,\n \"acc_norm_stderr\": 0.04385162325601553\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968431,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968431\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5460122699386503,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.5460122699386503,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041696,\n\
\ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041696\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n\
\ \"acc_stderr\": 0.02999695185834947,\n \"acc_norm\": 0.7008547008547008,\n\
\ \"acc_norm_stderr\": 0.02999695185834947\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.6781609195402298,\n \"acc_stderr\": 0.0167063814150579,\n\
\ \"acc_norm\": 0.6781609195402298,\n \"acc_norm_stderr\": 0.0167063814150579\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5520231213872833,\n\
\ \"acc_stderr\": 0.02677299065336182,\n \"acc_norm\": 0.5520231213872833,\n\
\ \"acc_norm_stderr\": 0.02677299065336182\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.23128491620111732,\n \"acc_stderr\": 0.014102223623152587,\n\
\ \"acc_norm\": 0.23128491620111732,\n \"acc_norm_stderr\": 0.014102223623152587\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.48366013071895425,\n\
\ \"acc_stderr\": 0.028614624752805413,\n \"acc_norm\": 0.48366013071895425,\n\
\ \"acc_norm_stderr\": 0.028614624752805413\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.5852090032154341,\n \"acc_stderr\": 0.027982680459759553,\n\
\ \"acc_norm\": 0.5852090032154341,\n \"acc_norm_stderr\": 0.027982680459759553\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5339506172839507,\n\
\ \"acc_stderr\": 0.027756535257347663,\n \"acc_norm\": 0.5339506172839507,\n\
\ \"acc_norm_stderr\": 0.027756535257347663\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3409387222946545,\n\
\ \"acc_stderr\": 0.01210681720306721,\n \"acc_norm\": 0.3409387222946545,\n\
\ \"acc_norm_stderr\": 0.01210681720306721\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.03000856284500349,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.03000856284500349\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.477124183006536,\n \"acc_stderr\": 0.020206653187884782,\n \
\ \"acc_norm\": 0.477124183006536,\n \"acc_norm_stderr\": 0.020206653187884782\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46938775510204084,\n \"acc_stderr\": 0.031949171367580624,\n\
\ \"acc_norm\": 0.46938775510204084,\n \"acc_norm_stderr\": 0.031949171367580624\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268814,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268814\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144905,\n \"mc2\": 0.4686795780185653,\n\
\ \"mc2_stderr\": 0.015229887658311217\n }\n}\n```"
repo_url: https://huggingface.co/FlagAlpha/Llama2-Chinese-7b-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-55-21.985751.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-55-21.985751.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-55-21.985751.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-55-21.985751.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_55_21.985751
path:
- results_2023-10-01T14-55-21.985751.parquet
- split: latest
path:
- results_2023-10-01T14-55-21.985751.parquet
---
# Dataset Card for Evaluation run of FlagAlpha/Llama2-Chinese-7b-Chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FlagAlpha/Llama2-Chinese-7b-Chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [FlagAlpha/Llama2-Chinese-7b-Chat](https://huggingface.co/FlagAlpha/Llama2-Chinese-7b-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-7b-Chat",
"harness_truthfulqa_mc_0",
split="train")
```
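For example, to pin the analysis to the most recent evaluation you can request the `latest` split explicitly, or load the aggregated `results` configuration (a minimal sketch; the configuration and split names are the ones defined in the YAML header of this card):
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-7b-Chat"

# Per-example details for the most recent TruthfulQA run.
details = load_dataset(REPO, "harness_truthfulqa_mc_0", split="latest")

# Aggregated metrics for the whole run live in the "results" configuration.
agg = load_dataset(REPO, "results", split="latest")
print(agg[0])  # one row holding the aggregated scores shown below
```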
## Latest results
These are the [latest results from run 2023-10-01T14:55:21.985751](https://huggingface.co/datasets/open-llm-leaderboard/details_FlagAlpha__Llama2-Chinese-7b-Chat/blob/main/results_2023-10-01T14-55-21.985751.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4789900071561118,
"acc_stderr": 0.03514490754147888,
"acc_norm": 0.4830378814240372,
"acc_norm_stderr": 0.03513182000347704,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144905,
"mc2": 0.4686795780185653,
"mc2_stderr": 0.015229887658311217
},
"harness|arc:challenge|25": {
"acc": 0.48293515358361777,
"acc_stderr": 0.014602878388536598,
"acc_norm": 0.5238907849829352,
"acc_norm_stderr": 0.014594701798071654
},
"harness|hellaswag|10": {
"acc": 0.5773750248954391,
"acc_stderr": 0.004929672777184316,
"acc_norm": 0.7752439753037244,
"acc_norm_stderr": 0.004165684625540422
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.0236369759961018,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.0236369759961018
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5451612903225806,
"acc_stderr": 0.028327743091561074,
"acc_norm": 0.5451612903225806,
"acc_norm_stderr": 0.028327743091561074
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5858585858585859,
"acc_stderr": 0.03509438348879629,
"acc_norm": 0.5858585858585859,
"acc_norm_stderr": 0.03509438348879629
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.032922966391551414,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.032922966391551414
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4205128205128205,
"acc_stderr": 0.025028610276710862,
"acc_norm": 0.4205128205128205,
"acc_norm_stderr": 0.025028610276710862
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871927,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6385321100917432,
"acc_stderr": 0.020598082009937374,
"acc_norm": 0.6385321100917432,
"acc_norm_stderr": 0.020598082009937374
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03214952147802751,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03214952147802751
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.033321399446680854,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.033321399446680854
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6624472573839663,
"acc_stderr": 0.030781549102026223,
"acc_norm": 0.6624472573839663,
"acc_norm_stderr": 0.030781549102026223
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5605381165919282,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.5605381165919282,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5038167938931297,
"acc_stderr": 0.04385162325601553,
"acc_norm": 0.5038167938931297,
"acc_norm_stderr": 0.04385162325601553
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5460122699386503,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.5460122699386503,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.04750458399041696,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.04750458399041696
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7008547008547008,
"acc_stderr": 0.02999695185834947,
"acc_norm": 0.7008547008547008,
"acc_norm_stderr": 0.02999695185834947
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6781609195402298,
"acc_stderr": 0.0167063814150579,
"acc_norm": 0.6781609195402298,
"acc_norm_stderr": 0.0167063814150579
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5520231213872833,
"acc_stderr": 0.02677299065336182,
"acc_norm": 0.5520231213872833,
"acc_norm_stderr": 0.02677299065336182
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23128491620111732,
"acc_stderr": 0.014102223623152587,
"acc_norm": 0.23128491620111732,
"acc_norm_stderr": 0.014102223623152587
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.028614624752805413,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.028614624752805413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5852090032154341,
"acc_stderr": 0.027982680459759553,
"acc_norm": 0.5852090032154341,
"acc_norm_stderr": 0.027982680459759553
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5339506172839507,
"acc_stderr": 0.027756535257347663,
"acc_norm": 0.5339506172839507,
"acc_norm_stderr": 0.027756535257347663
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3409387222946545,
"acc_stderr": 0.01210681720306721,
"acc_norm": 0.3409387222946545,
"acc_norm_stderr": 0.01210681720306721
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.03000856284500349,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.03000856284500349
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.477124183006536,
"acc_stderr": 0.020206653187884782,
"acc_norm": 0.477124183006536,
"acc_norm_stderr": 0.020206653187884782
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46938775510204084,
"acc_stderr": 0.031949171367580624,
"acc_norm": 0.46938775510204084,
"acc_norm_stderr": 0.031949171367580624
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268814,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268814
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144905,
"mc2": 0.4686795780185653,
"mc2_stderr": 0.015229887658311217
}
}
```
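To collapse the 57 `hendrycksTest` entries above into a single MMLU number, you can average their `acc` values. A minimal sketch, assuming the JSON object printed above has been saved locally as `results.json` (it is the file linked at the top of this section):
```python
import json
from pathlib import Path

# Assumption: results.json holds the JSON object printed above.
run_results = json.loads(Path("results.json").read_text())

# Average accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu_scores = [
    v["acc"]
    for k, v in run_results.items()
    if k.startswith("harness|hendrycksTest-")
]
print(f"MMLU average acc over {len(mmlu_scores)} subtasks: "
      f"{sum(mmlu_scores) / len(mmlu_scores):.4f}")
```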
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down | 2023-10-01T14:58:39.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T14:56:49.451812](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down/blob/main/results_2023-10-01T14-56-49.451812.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5535223876713008,\n\
\ \"acc_stderr\": 0.034397489910252724,\n \"acc_norm\": 0.5576804689373647,\n\
\ \"acc_norm_stderr\": 0.03437903752115374,\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834562,\n \"mc2\": 0.39596272362402163,\n\
\ \"mc2_stderr\": 0.014138045284348281\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5051194539249146,\n \"acc_stderr\": 0.01461062489030916,\n\
\ \"acc_norm\": 0.5435153583617748,\n \"acc_norm_stderr\": 0.014555949760496444\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6143198566022705,\n\
\ \"acc_stderr\": 0.00485760764116063,\n \"acc_norm\": 0.8212507468631747,\n\
\ \"acc_norm_stderr\": 0.003823591814133032\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.49710982658959535,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899207,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n\
\ \"acc_stderr\": 0.0275289042998457,\n \"acc_norm\": 0.6258064516129033,\n\
\ \"acc_norm_stderr\": 0.0275289042998457\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.035025446508458714,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.035025446508458714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391244,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391244\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.030031147977641538,\n\
\ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.030031147977641538\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4846153846153846,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.02763490726417854,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.02763490726417854\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182087,\n \
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182087\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7522935779816514,\n \"acc_stderr\": 0.018508143602547825,\n \"\
acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.018508143602547825\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.033622774366080445,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.033622774366080445\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922747,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922747\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n\
\ \"acc_stderr\": 0.015075523238101069,\n \"acc_norm\": 0.768837803320562,\n\
\ \"acc_norm_stderr\": 0.015075523238101069\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.025950054337654082,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.025950054337654082\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n\
\ \"acc_stderr\": 0.016051419760310263,\n \"acc_norm\": 0.35977653631284917,\n\
\ \"acc_norm_stderr\": 0.016051419760310263\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.02807415894760065,\n\
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.02807415894760065\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.02682280175950789,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.02682280175950789\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766002,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766002\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4315514993481095,\n\
\ \"acc_stderr\": 0.012650007999463878,\n \"acc_norm\": 0.4315514993481095,\n\
\ \"acc_norm_stderr\": 0.012650007999463878\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.03010563657001663,\n\
\ \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.03010563657001663\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5735294117647058,\n \"acc_stderr\": 0.020007912739359375,\n \
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.020007912739359375\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n\
\ \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834562,\n \"mc2\": 0.39596272362402163,\n\
\ \"mc2_stderr\": 0.014138045284348281\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-56-49.451812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-56-49.451812.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-56-49.451812.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-56-49.451812.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_56_49.451812
path:
- results_2023-10-01T14-56-49.451812.parquet
- split: latest
path:
- results_2023-10-01T14-56-49.451812.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down",
"harness_truthfulqa_mc_0",
split="train")
```
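Each per-task configuration also exposes a `latest` split, and the aggregated metrics live in the `results` configuration. As a minimal sketch (using only the configuration and split names declared in this card's YAML), you could load them like this:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down"

# Aggregated metrics for the most recent run ("results" config, "latest" split).
results = load_dataset(REPO, "results", split="latest")

# Latest per-sample details for a single task, e.g. ARC-Challenge (25-shot).
arc_details = load_dataset(REPO, "harness_arc_challenge_25", split="latest")
```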
## Latest results
These are the [latest results from run 2023-10-01T14:56:49.451812](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down/blob/main/results_2023-10-01T14-56-49.451812.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5535223876713008,
"acc_stderr": 0.034397489910252724,
"acc_norm": 0.5576804689373647,
"acc_norm_stderr": 0.03437903752115374,
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834562,
"mc2": 0.39596272362402163,
"mc2_stderr": 0.014138045284348281
},
"harness|arc:challenge|25": {
"acc": 0.5051194539249146,
"acc_stderr": 0.01461062489030916,
"acc_norm": 0.5435153583617748,
"acc_norm_stderr": 0.014555949760496444
},
"harness|hellaswag|10": {
"acc": 0.6143198566022705,
"acc_stderr": 0.00485760764116063,
"acc_norm": 0.8212507468631747,
"acc_norm_stderr": 0.003823591814133032
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899207,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.0275289042998457,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.0275289042998457
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.035025446508458714,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.035025446508458714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391244,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391244
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.030031147977641538,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.030031147977641538
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4846153846153846,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.4846153846153846,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.02763490726417854,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.02763490726417854
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182087,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182087
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.018508143602547825,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.018508143602547825
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.033622774366080445,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.033622774366080445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922747,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922747
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.015075523238101069,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.015075523238101069
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.025950054337654082,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.025950054337654082
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310263,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310263
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.02807415894760065,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.02807415894760065
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.02682280175950789,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.02682280175950789
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766002,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766002
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4315514993481095,
"acc_stderr": 0.012650007999463878,
"acc_norm": 0.4315514993481095,
"acc_norm_stderr": 0.012650007999463878
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.03010563657001663,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.03010563657001663
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.020007912739359375,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.020007912739359375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834562,
"mc2": 0.39596272362402163,
"mc2_stderr": 0.014138045284348281
}
}
```
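To work with these aggregate numbers programmatically, one option is to download the raw results file and read the "all" block. The sketch below uses `huggingface_hub` and assumes the file matches the structure shown above; the filename is the one referenced by this card's results links.
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r8-gate_up_down",
    filename="results_2023-10-01T14-56-49.451812.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# "all" aggregates accuracy over every evaluated task (assuming the layout shown above).
print(results["all"]["acc"], results["all"]["acc_norm"])
```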
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o | 2023-10-01T15:01:26.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T15:00:03.334744](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o/blob/main/results_2023-10-01T15-00-03.334744.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the \"results\" config and the\
\ \"latest\" split for each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.5589348566915583,\n\
\ \"acc_stderr\": 0.034415220347494106,\n \"acc_norm\": 0.5631958579237637,\n\
\ \"acc_norm_stderr\": 0.034396072803843096,\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.3975756506862768,\n\
\ \"mc2_stderr\": 0.014052876252966735\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5179180887372014,\n \"acc_stderr\": 0.014602005585490975,\n\
\ \"acc_norm\": 0.5622866894197952,\n \"acc_norm_stderr\": 0.01449757388110829\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6127265484963155,\n\
\ \"acc_stderr\": 0.004861314613286844,\n \"acc_norm\": 0.8197570205138419,\n\
\ \"acc_norm_stderr\": 0.003836041242259808\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854498,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854498\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.041321250197233685,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.041321250197233685\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.02437319786798306,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.02437319786798306\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.034867317274198714,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.034867317274198714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.037425970438065864,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.037425970438065864\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.0251891498947642,\n \
\ \"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.0251891498947642\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7688073394495413,\n \"acc_stderr\": 0.01807575024163315,\n \"\
acc_norm\": 0.7688073394495413,\n \"acc_norm_stderr\": 0.01807575024163315\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.043642261558410445,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.043642261558410445\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.02537213967172293,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.02537213967172293\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n\
\ \"acc_stderr\": 0.015104550008905707,\n \"acc_norm\": 0.7675606641123882,\n\
\ \"acc_norm_stderr\": 0.015104550008905707\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895806,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895806\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3776536312849162,\n\
\ \"acc_stderr\": 0.01621414875213663,\n \"acc_norm\": 0.3776536312849162,\n\
\ \"acc_norm_stderr\": 0.01621414875213663\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824093,\n\
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824093\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.026981478043648043,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.026981478043648043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.02677492989972233,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.02677492989972233\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594113,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594113\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4230769230769231,\n\
\ \"acc_stderr\": 0.012618204066588392,\n \"acc_norm\": 0.4230769230769231,\n\
\ \"acc_norm_stderr\": 0.012618204066588392\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324227,\n \
\ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324227\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03136250240935894,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03136250240935894\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.03220024104534204,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.03220024104534204\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.3975756506862768,\n\
\ \"mc2_stderr\": 0.014052876252966735\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-00-03.334744.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-00-03.334744.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-00-03.334744.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-00-03.334744.parquet'
- config_name: results
data_files:
- split: 2023_10_01T15_00_03.334744
path:
- results_2023-10-01T15-00-03.334744.parquet
- split: latest
path:
- results_2023-10-01T15-00-03.334744.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o",
"harness_truthfulqa_mc_0",
split="train")
```
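You can also load the aggregated metrics, or pin a specific run instead of relying on the latest one. A minimal sketch, using only the config and split names declared in this card's metadata:
```python
from datasets import load_dataset

# Aggregated metrics for the whole run: the "results" config,
# with "latest" pointing at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o",
    "results",
    split="latest",
)

# Per-sample details for one task, pinned to a specific run
# via its timestamped split name.
details = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o",
    "harness_truthfulqa_mc_0",
    split="2023_10_01T15_00_03.334744",
)
```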
## Latest results
These are the [latest results from run 2023-10-01T15:00:03.334744](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o/blob/main/results_2023-10-01T15-00-03.334744.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5589348566915583,
"acc_stderr": 0.034415220347494106,
"acc_norm": 0.5631958579237637,
"acc_norm_stderr": 0.034396072803843096,
"mc1": 0.2717258261933905,
"mc1_stderr": 0.01557284045287583,
"mc2": 0.3975756506862768,
"mc2_stderr": 0.014052876252966735
},
"harness|arc:challenge|25": {
"acc": 0.5179180887372014,
"acc_stderr": 0.014602005585490975,
"acc_norm": 0.5622866894197952,
"acc_norm_stderr": 0.01449757388110829
},
"harness|hellaswag|10": {
"acc": 0.6127265484963155,
"acc_stderr": 0.004861314613286844,
"acc_norm": 0.8197570205138419,
"acc_norm_stderr": 0.003836041242259808
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854498,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854498
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.041321250197233685,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.041321250197233685
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.02437319786798306,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.02437319786798306
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.037425970438065864,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.037425970438065864
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5564102564102564,
"acc_stderr": 0.0251891498947642,
"acc_norm": 0.5564102564102564,
"acc_norm_stderr": 0.0251891498947642
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7688073394495413,
"acc_stderr": 0.01807575024163315,
"acc_norm": 0.7688073394495413,
"acc_norm_stderr": 0.01807575024163315
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243741,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929188,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929188
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.043642261558410445,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.043642261558410445
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.02537213967172293,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.02537213967172293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7675606641123882,
"acc_stderr": 0.015104550008905707,
"acc_norm": 0.7675606641123882,
"acc_norm_stderr": 0.015104550008905707
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895806,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895806
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3776536312849162,
"acc_stderr": 0.01621414875213663,
"acc_norm": 0.3776536312849162,
"acc_norm_stderr": 0.01621414875213663
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824093,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824093
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.026981478043648043,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.026981478043648043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.02677492989972233,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.02677492989972233
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594113,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594113
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.012618204066588392,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.012618204066588392
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324227,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.03136250240935894,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03136250240935894
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.03220024104534204,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.03220024104534204
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2717258261933905,
"mc1_stderr": 0.01557284045287583,
"mc2": 0.3975756506862768,
"mc2_stderr": 0.014052876252966735
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
khalidalt/sungai_ul2_instructions | 2023-10-08T20:17:11.000Z | [
"region:us"
] | khalidalt | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: metadata
struct:
- name: source
dtype: string
splits:
- name: train
num_bytes: 9307
num_examples: 20
download_size: 10340
dataset_size: 9307
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sungai_ul2_instructions"
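As this card was generated automatically, only the metadata above describes the data. A minimal usage sketch based on that metadata (a single `train` split with a `text` field and a `metadata.source` field):
```python
from datasets import load_dataset

# Load the only declared split.
ds = load_dataset("khalidalt/sungai_ul2_instructions", split="train")

# Each example carries the instruction text and its source.
example = ds[0]
print(example["text"])
print(example["metadata"]["source"])
```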
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down | 2023-10-01T15:05:07.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T15:03:43.604689](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down/blob/main/results_2023-10-01T15-03-43.604689.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5663422561116331,\n\
\ \"acc_stderr\": 0.034310991382522825,\n \"acc_norm\": 0.5704277982864568,\n\
\ \"acc_norm_stderr\": 0.0342924627949037,\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.01534540948555798,\n \"mc2\": 0.3807436553778993,\n\
\ \"mc2_stderr\": 0.013953151574890315\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5179180887372014,\n \"acc_stderr\": 0.014602005585490975,\n\
\ \"acc_norm\": 0.5503412969283277,\n \"acc_norm_stderr\": 0.014537144444284732\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6110336586337383,\n\
\ \"acc_stderr\": 0.0048651932370240465,\n \"acc_norm\": 0.8196574387572196,\n\
\ \"acc_norm_stderr\": 0.0038368677087019898\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.03043779434298305,\n\
\ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.03043779434298305\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.04032999053960719\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179327,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179327\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425082,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425082\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.02737987122994325,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.02737987122994325\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.0364620496325381,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.0364620496325381\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.02515826601686858,\n \
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.02515826601686858\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.031357095996135904,\n\
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.031357095996135904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7651376146788991,\n\
\ \"acc_stderr\": 0.018175110510343564,\n \"acc_norm\": 0.7651376146788991,\n\
\ \"acc_norm_stderr\": 0.018175110510343564\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n \
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7713920817369093,\n\
\ \"acc_stderr\": 0.01501688469853988,\n \"acc_norm\": 0.7713920817369093,\n\
\ \"acc_norm_stderr\": 0.01501688469853988\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531025,\n\
\ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531025\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47374301675977654,\n\
\ \"acc_stderr\": 0.016699427672784765,\n \"acc_norm\": 0.47374301675977654,\n\
\ \"acc_norm_stderr\": 0.016699427672784765\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.028245134024387292,\n\
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.028245134024387292\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.02677492989972233,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.02677492989972233\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n\
\ \"acc_stderr\": 0.012654565234622864,\n \"acc_norm\": 0.43285528031290743,\n\
\ \"acc_norm_stderr\": 0.012654565234622864\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928006,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928006\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5866013071895425,\n \"acc_stderr\": 0.019922115682786692,\n \
\ \"acc_norm\": 0.5866013071895425,\n \"acc_norm_stderr\": 0.019922115682786692\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674269,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674269\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355558,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.01534540948555798,\n \"mc2\": 0.3807436553778993,\n\
\ \"mc2_stderr\": 0.013953151574890315\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-03-43.604689.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-03-43.604689.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-03-43.604689.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-03-43.604689.parquet'
- config_name: results
data_files:
- split: 2023_10_01T15_03_43.604689
path:
- results_2023-10-01T15-03-43.604689.parquet
- split: latest
path:
- results_2023-10-01T15-03-43.604689.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down",
"harness_truthfulqa_mc_0",
split="train")
```
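Similarly, a minimal sketch for pulling the aggregated metrics instead of per-example details: the `results` configuration exposes a `latest` split (the config and split names are taken from the config listing above).

```python
from datasets import load_dataset

# Aggregated metrics of the most recent evaluation run; "latest" mirrors
# the newest timestamped split listed in the configs.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down",
    "results",
    split="latest",
)
print(results[0])
```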
## Latest results
These are the [latest results from run 2023-10-01T15:03:43.604689](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-gate_up_down/blob/main/results_2023-10-01T15-03-43.604689.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5663422561116331,
"acc_stderr": 0.034310991382522825,
"acc_norm": 0.5704277982864568,
"acc_norm_stderr": 0.0342924627949037,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.01534540948555798,
"mc2": 0.3807436553778993,
"mc2_stderr": 0.013953151574890315
},
"harness|arc:challenge|25": {
"acc": 0.5179180887372014,
"acc_stderr": 0.014602005585490975,
"acc_norm": 0.5503412969283277,
"acc_norm_stderr": 0.014537144444284732
},
"harness|hellaswag|10": {
"acc": 0.6110336586337383,
"acc_stderr": 0.0048651932370240465,
"acc_norm": 0.8196574387572196,
"acc_norm_stderr": 0.0038368677087019898
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.03043779434298305,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.03043779434298305
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.04032999053960719,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.04032999053960719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179327,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179327
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425082,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425082
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.02737987122994325,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.02737987122994325
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.0364620496325381,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.0364620496325381
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.02515826601686858,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.02515826601686858
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7651376146788991,
"acc_stderr": 0.018175110510343564,
"acc_norm": 0.7651376146788991,
"acc_norm_stderr": 0.018175110510343564
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7713920817369093,
"acc_stderr": 0.01501688469853988,
"acc_norm": 0.7713920817369093,
"acc_norm_stderr": 0.01501688469853988
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531025,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531025
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47374301675977654,
"acc_stderr": 0.016699427672784765,
"acc_norm": 0.47374301675977654,
"acc_norm_stderr": 0.016699427672784765
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.028245134024387292,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.028245134024387292
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.02677492989972233,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.02677492989972233
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622864,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622864
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928006,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5866013071895425,
"acc_stderr": 0.019922115682786692,
"acc_norm": 0.5866013071895425,
"acc_norm_stderr": 0.019922115682786692
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.03151236044674269,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.03151236044674269
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355558,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355558
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.01534540948555798,
"mc2": 0.3807436553778993,
"mc2_stderr": 0.013953151574890315
}
}
```
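Since the block above is an ordinary Python dict, the per-task entries are easy to post-process. As a minimal sketch (assumption: the dict printed above is bound to a variable `results`; nothing here is part of the official tooling), this ranks the MMLU (hendrycksTest) sub-tasks by accuracy to surface the weakest areas:
```python
# Minimal sketch (assumption: the dict printed above is bound to `results`).
# Rank the MMLU (hendrycksTest) sub-tasks by accuracy to surface weak areas.
subtasks = [
    (key.split("|")[1].replace("hendrycksTest-", ""), vals["acc"])
    for key, vals in results.items()
    if key.startswith("harness|hendrycksTest")
]
for name, acc in sorted(subtasks, key=lambda t: t[1])[:5]:
    print(f"{name:<35} {acc:.3f}")  # the five weakest sub-tasks
```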
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open-Platypus_2.5w | 2023-10-01T15:08:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-Open-Platypus_2.5w
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-Open-Platypus_2.5w](https://huggingface.co/CHIH-HUNG/llama-2-13b-Open-Platypus_2.5w)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open-Platypus_2.5w\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T15:07:10.980202](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open-Platypus_2.5w/blob/main/results_2023-10-01T15-07-10.980202.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5613314648319273,\n\
\ \"acc_stderr\": 0.034228367793821476,\n \"acc_norm\": 0.5656910512543237,\n\
\ \"acc_norm_stderr\": 0.03420695006341245,\n \"mc1\": 0.2864137086903305,\n\
\ \"mc1_stderr\": 0.01582614243950235,\n \"mc2\": 0.42450489033852573,\n\
\ \"mc2_stderr\": 0.014188784752136422\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5460750853242321,\n \"acc_stderr\": 0.014549221105171862,\n\
\ \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.014342036483436177\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6169089822744473,\n\
\ \"acc_stderr\": 0.004851466623601452,\n \"acc_norm\": 0.824636526588329,\n\
\ \"acc_norm_stderr\": 0.0037950051512043157\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.029946498567699948,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.029946498567699948\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842509,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842509\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.026593084516572267,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.026593084516572267\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7171717171717171,\n \"acc_stderr\": 0.03208779558786752,\n \"\
acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.03208779558786752\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860695,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860695\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5230769230769231,\n \"acc_stderr\": 0.025323990861736232,\n\
\ \"acc_norm\": 0.5230769230769231,\n \"acc_norm_stderr\": 0.025323990861736232\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5504201680672269,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.5504201680672269,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7596330275229358,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.7596330275229358,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145638,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145638\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.03252113489929187,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.03252113489929187\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644559,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644559\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.041032038305145124,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.041032038305145124\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291518,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291518\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.02685345037700917,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.02685345037700917\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\
\ \"acc_stderr\": 0.015246803197398677,\n \"acc_norm\": 0.7611749680715197,\n\
\ \"acc_norm_stderr\": 0.015246803197398677\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48156424581005586,\n\
\ \"acc_stderr\": 0.016711130497782823,\n \"acc_norm\": 0.48156424581005586,\n\
\ \"acc_norm_stderr\": 0.016711130497782823\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02782610930728369,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02782610930728369\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700188,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700188\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4178617992177314,\n\
\ \"acc_stderr\": 0.012596744108998557,\n \"acc_norm\": 0.4178617992177314,\n\
\ \"acc_norm_stderr\": 0.012596744108998557\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5604575163398693,\n \"acc_stderr\": 0.020079420408087918,\n \
\ \"acc_norm\": 0.5604575163398693,\n \"acc_norm_stderr\": 0.020079420408087918\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\
\ \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.7114427860696517,\n\
\ \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n\
\ \"mc1_stderr\": 0.01582614243950235,\n \"mc2\": 0.42450489033852573,\n\
\ \"mc2_stderr\": 0.014188784752136422\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-Open-Platypus_2.5w
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-07-10.980202.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-07-10.980202.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-07-10.980202.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-07-10.980202.parquet'
- config_name: results
data_files:
- split: 2023_10_01T15_07_10.980202
path:
- results_2023-10-01T15-07-10.980202.parquet
- split: latest
path:
- results_2023-10-01T15-07-10.980202.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-Open-Platypus_2.5w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-Open-Platypus_2.5w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-Open-Platypus_2.5w](https://huggingface.co/CHIH-HUNG/llama-2-13b-Open-Platypus_2.5w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open-Platypus_2.5w",
"harness_truthfulqa_mc_0",
split="train")
```
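The same call pattern works for any configuration listed in this card's metadata. As a further sketch (assumption: using the `latest` split named in the config listing above, which mirrors the most recent timestamped split), the aggregated per-run metrics can be loaded directly:
```python
from datasets import load_dataset

# Sketch: load the aggregated "results" configuration. Per the config
# listing above, every configuration exposes a timestamped split plus a
# "latest" split pointing at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open-Platypus_2.5w",
    "results",
    split="latest",
)
print(results[0])  # one row with the aggregated metrics for the run
```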
## Latest results
These are the [latest results from run 2023-10-01T15:07:10.980202](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open-Platypus_2.5w/blob/main/results_2023-10-01T15-07-10.980202.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5613314648319273,
"acc_stderr": 0.034228367793821476,
"acc_norm": 0.5656910512543237,
"acc_norm_stderr": 0.03420695006341245,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.01582614243950235,
"mc2": 0.42450489033852573,
"mc2_stderr": 0.014188784752136422
},
"harness|arc:challenge|25": {
"acc": 0.5460750853242321,
"acc_stderr": 0.014549221105171862,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.014342036483436177
},
"harness|hellaswag|10": {
"acc": 0.6169089822744473,
"acc_stderr": 0.004851466623601452,
"acc_norm": 0.824636526588329,
"acc_norm_stderr": 0.0037950051512043157
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.029946498567699948,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.029946498567699948
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.02422996529842509,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.02422996529842509
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572267,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572267
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.03208779558786752,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.03208779558786752
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860695,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860695
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5230769230769231,
"acc_stderr": 0.025323990861736232,
"acc_norm": 0.5230769230769231,
"acc_norm_stderr": 0.025323990861736232
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5504201680672269,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.5504201680672269,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7596330275229358,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.7596330275229358,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145638,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145638
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929187,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929187
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.04260735157644559,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.04260735157644559
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.041032038305145124,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.041032038305145124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291518,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291518
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.02685345037700917,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.02685345037700917
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398677,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398677
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48156424581005586,
"acc_stderr": 0.016711130497782823,
"acc_norm": 0.48156424581005586,
"acc_norm_stderr": 0.016711130497782823
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02782610930728369,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02782610930728369
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700188,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700188
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4178617992177314,
"acc_stderr": 0.012596744108998557,
"acc_norm": 0.4178617992177314,
"acc_norm_stderr": 0.012596744108998557
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5604575163398693,
"acc_stderr": 0.020079420408087918,
"acc_norm": 0.5604575163398693,
"acc_norm_stderr": 0.020079420408087918
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.01582614243950235,
"mc2": 0.42450489033852573,
"mc2_stderr": 0.014188784752136422
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_zarakiquemparte__zararp-1.1-l2-7b | 2023-10-01T15:08:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of zarakiquemparte/zararp-1.1-l2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zarakiquemparte/zararp-1.1-l2-7b](https://huggingface.co/zarakiquemparte/zararp-1.1-l2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__zararp-1.1-l2-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T15:07:29.187841](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zararp-1.1-l2-7b/blob/main/results_2023-10-01T15-07-29.187841.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5167187993951239,\n\
\ \"acc_stderr\": 0.0350473748137711,\n \"acc_norm\": 0.5203563763183874,\n\
\ \"acc_norm_stderr\": 0.035032234806834255,\n \"mc1\": 0.35862913096695226,\n\
\ \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.519856175686671,\n\
\ \"mc2_stderr\": 0.015417587330124145\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.537542662116041,\n \"acc_stderr\": 0.01457014449507558,\n\
\ \"acc_norm\": 0.5648464163822525,\n \"acc_norm_stderr\": 0.014487986197186045\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6011750647281418,\n\
\ \"acc_stderr\": 0.004886559008754984,\n \"acc_norm\": 0.7884883489344752,\n\
\ \"acc_norm_stderr\": 0.004075456897370668\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5622641509433962,\n \"acc_stderr\": 0.03053333843046752,\n\
\ \"acc_norm\": 0.5622641509433962,\n \"acc_norm_stderr\": 0.03053333843046752\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29365079365079366,\n \"acc_stderr\": 0.023456037383982026,\n \"\
acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.023456037383982026\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5774193548387097,\n \"acc_stderr\": 0.02810096472427264,\n \"\
acc_norm\": 0.5774193548387097,\n \"acc_norm_stderr\": 0.02810096472427264\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n \"\
acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03358618145732522,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03358618145732522\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845443,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4846153846153846,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4957983193277311,\n \"acc_stderr\": 0.03247734334448111,\n \
\ \"acc_norm\": 0.4957983193277311,\n \"acc_norm_stderr\": 0.03247734334448111\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7137614678899082,\n \"acc_stderr\": 0.01937943662891999,\n \"\
acc_norm\": 0.7137614678899082,\n \"acc_norm_stderr\": 0.01937943662891999\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696043,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n \"\
acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598028,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598028\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.03915857291436971,\n\
\ \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.03915857291436971\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7564102564102564,\n\
\ \"acc_stderr\": 0.028120966503914404,\n \"acc_norm\": 0.7564102564102564,\n\
\ \"acc_norm_stderr\": 0.028120966503914404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7126436781609196,\n\
\ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.7126436781609196,\n\
\ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.026817718130348923,\n\
\ \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.026817718130348923\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2748603351955307,\n\
\ \"acc_stderr\": 0.014931316703220504,\n \"acc_norm\": 0.2748603351955307,\n\
\ \"acc_norm_stderr\": 0.014931316703220504\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.028541722692618874,\n\
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.028541722692618874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.027701228468542595,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.027701228468542595\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.02909767559946393,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.02909767559946393\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37157757496740546,\n\
\ \"acc_stderr\": 0.01234182851452829,\n \"acc_norm\": 0.37157757496740546,\n\
\ \"acc_norm_stderr\": 0.01234182851452829\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.49836601307189543,\n \"acc_stderr\": 0.020227726838150124,\n \
\ \"acc_norm\": 0.49836601307189543,\n \"acc_norm_stderr\": 0.020227726838150124\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n\
\ \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\
\ \"acc_stderr\": 0.03368787466115459,\n \"acc_norm\": 0.6517412935323383,\n\
\ \"acc_norm_stderr\": 0.03368787466115459\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n\
\ \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.519856175686671,\n\
\ \"mc2_stderr\": 0.015417587330124145\n }\n}\n```"
repo_url: https://huggingface.co/zarakiquemparte/zararp-1.1-l2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-07-29.187841.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-07-29.187841.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-07-29.187841.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-07-29.187841.parquet'
- config_name: results
data_files:
- split: 2023_10_01T15_07_29.187841
path:
- results_2023-10-01T15-07-29.187841.parquet
- split: latest
path:
- results_2023-10-01T15-07-29.187841.parquet
---
# Dataset Card for Evaluation run of zarakiquemparte/zararp-1.1-l2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/zarakiquemparte/zararp-1.1-l2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [zarakiquemparte/zararp-1.1-l2-7b](https://huggingface.co/zarakiquemparte/zararp-1.1-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__zararp-1.1-l2-7b",
"harness_truthfulqa_mc_0",
split="train")
```
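The same pattern works for the aggregated metrics; a minimal sketch, using the "results" configuration and "latest" split declared in this card's config list:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; "results" and "latest" are the
# configuration and split names declared in this card's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_zarakiquemparte__zararp-1.1-l2-7b",
    "results",
    split="latest",
)
```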
## Latest results
These are the [latest results from run 2023-10-01T15:07:29.187841](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zararp-1.1-l2-7b/blob/main/results_2023-10-01T15-07-29.187841.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5167187993951239,
"acc_stderr": 0.0350473748137711,
"acc_norm": 0.5203563763183874,
"acc_norm_stderr": 0.035032234806834255,
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.519856175686671,
"mc2_stderr": 0.015417587330124145
},
"harness|arc:challenge|25": {
"acc": 0.537542662116041,
"acc_stderr": 0.01457014449507558,
"acc_norm": 0.5648464163822525,
"acc_norm_stderr": 0.014487986197186045
},
"harness|hellaswag|10": {
"acc": 0.6011750647281418,
"acc_stderr": 0.004886559008754984,
"acc_norm": 0.7884883489344752,
"acc_norm_stderr": 0.004075456897370668
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5622641509433962,
"acc_stderr": 0.03053333843046752,
"acc_norm": 0.5622641509433962,
"acc_norm_stderr": 0.03053333843046752
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.023456037383982026,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.023456037383982026
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5774193548387097,
"acc_stderr": 0.02810096472427264,
"acc_norm": 0.5774193548387097,
"acc_norm_stderr": 0.02810096472427264
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03358618145732522,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03358618145732522
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845443,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4846153846153846,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.4846153846153846,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4957983193277311,
"acc_stderr": 0.03247734334448111,
"acc_norm": 0.4957983193277311,
"acc_norm_stderr": 0.03247734334448111
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7137614678899082,
"acc_stderr": 0.01937943662891999,
"acc_norm": 0.7137614678899082,
"acc_norm_stderr": 0.01937943662891999
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373617,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598028,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598028
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5398773006134969,
"acc_stderr": 0.03915857291436971,
"acc_norm": 0.5398773006134969,
"acc_norm_stderr": 0.03915857291436971
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7564102564102564,
"acc_stderr": 0.028120966503914404,
"acc_norm": 0.7564102564102564,
"acc_norm_stderr": 0.028120966503914404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7126436781609196,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.7126436781609196,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.026817718130348923,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.026817718130348923
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2748603351955307,
"acc_stderr": 0.014931316703220504,
"acc_norm": 0.2748603351955307,
"acc_norm_stderr": 0.014931316703220504
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.028541722692618874,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.028541722692618874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.027701228468542595,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.027701228468542595
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37157757496740546,
"acc_stderr": 0.01234182851452829,
"acc_norm": 0.37157757496740546,
"acc_norm_stderr": 0.01234182851452829
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.49836601307189543,
"acc_stderr": 0.020227726838150124,
"acc_norm": 0.49836601307189543,
"acc_norm_stderr": 0.020227726838150124
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.046737523336702384,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.046737523336702384
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5795918367346938,
"acc_stderr": 0.03160106993449601,
"acc_norm": 0.5795918367346938,
"acc_norm_stderr": 0.03160106993449601
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.03368787466115459,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.03368787466115459
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.519856175686671,
"mc2_stderr": 0.015417587330124145
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-v1.2 | 2023-10-01T15:15:54.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of The-Face-Of-Goonery/Huginn-13b-v1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [The-Face-Of-Goonery/Huginn-13b-v1.2](https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-v1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-v1.2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T15:14:32.857053](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-v1.2/blob/main/results_2023-10-01T15-14-32.857053.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the \"results\" configuration and the \"latest\" split of\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5553314222182426,\n\
\ \"acc_stderr\": 0.034457143336673,\n \"acc_norm\": 0.5590558069902605,\n\
\ \"acc_norm_stderr\": 0.03443619760576142,\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.01686294168408838,\n \"mc2\": 0.5197363921890529,\n\
\ \"mc2_stderr\": 0.015737419947776412\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.01440982551840308,\n\
\ \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.01425856388051378\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.642302330213105,\n\
\ \"acc_stderr\": 0.004783428874273592,\n \"acc_norm\": 0.8355905198167696,\n\
\ \"acc_norm_stderr\": 0.0036988923883801003\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5471698113207547,\n \"acc_stderr\": 0.030635627957961823,\n\
\ \"acc_norm\": 0.5471698113207547,\n \"acc_norm_stderr\": 0.030635627957961823\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670788,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670788\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237656,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237656\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523864,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.027379871229943255,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.027379871229943255\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178277,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178277\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7192660550458716,\n \"acc_stderr\": 0.01926605504587161,\n \"\
acc_norm\": 0.7192660550458716,\n \"acc_norm_stderr\": 0.01926605504587161\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145635,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145635\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114968,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114968\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650741,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650741\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890488,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890488\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n\
\ \"acc_stderr\": 0.015075523238101072,\n \"acc_norm\": 0.768837803320562,\n\
\ \"acc_norm_stderr\": 0.015075523238101072\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n\
\ \"acc_stderr\": 0.016598022120580418,\n \"acc_norm\": 0.43910614525139663,\n\
\ \"acc_norm_stderr\": 0.016598022120580418\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n\
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n\
\ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573083,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573083\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n\
\ \"acc_stderr\": 0.012633353557534423,\n \"acc_norm\": 0.42698826597131684,\n\
\ \"acc_norm_stderr\": 0.012633353557534423\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5751633986928104,\n \"acc_stderr\": 0.019997973035458333,\n \
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.019997973035458333\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.03106721126287247,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.03106721126287247\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n\
\ \"acc_stderr\": 0.03265819588512699,\n \"acc_norm\": 0.6915422885572139,\n\
\ \"acc_norm_stderr\": 0.03265819588512699\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.01686294168408838,\n \"mc2\": 0.5197363921890529,\n\
\ \"mc2_stderr\": 0.015737419947776412\n }\n}\n```"
repo_url: https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-v1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-14-32.857053.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-14-32.857053.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-14-32.857053.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-14-32.857053.parquet'
- config_name: results
data_files:
- split: 2023_10_01T15_14_32.857053
path:
- results_2023-10-01T15-14-32.857053.parquet
- split: latest
path:
- results_2023-10-01T15-14-32.857053.parquet
---
# Dataset Card for Evaluation run of The-Face-Of-Goonery/Huginn-13b-v1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-v1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [The-Face-Of-Goonery/Huginn-13b-v1.2](https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-v1.2",
"harness_truthfulqa_mc_0",
split="train")
```
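The aggregated metrics live in the "results" configuration; as a minimal sketch (using the same `datasets` API as above), the "latest" split always mirrors the most recent run, while the timestamped split (here `2023_10_01T15_14_32.857053`) pins a specific one:
```python
from datasets import load_dataset

# Aggregated metrics of the most recent run; swap "latest" for the
# timestamped split name to pin a specific evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-v1.2",
    "results",
    split="latest",
)
```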
## Latest results
These are the [latest results from run 2023-10-01T15:14:32.857053](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-v1.2/blob/main/results_2023-10-01T15-14-32.857053.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5553314222182426,
"acc_stderr": 0.034457143336673,
"acc_norm": 0.5590558069902605,
"acc_norm_stderr": 0.03443619760576142,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.01686294168408838,
"mc2": 0.5197363921890529,
"mc2_stderr": 0.015737419947776412
},
"harness|arc:challenge|25": {
"acc": 0.5827645051194539,
"acc_stderr": 0.01440982551840308,
"acc_norm": 0.6092150170648464,
"acc_norm_stderr": 0.01425856388051378
},
"harness|hellaswag|10": {
"acc": 0.642302330213105,
"acc_stderr": 0.004783428874273592,
"acc_norm": 0.8355905198167696,
"acc_norm_stderr": 0.0036988923883801003
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5471698113207547,
"acc_stderr": 0.030635627957961823,
"acc_norm": 0.5471698113207547,
"acc_norm_stderr": 0.030635627957961823
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670788,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670788
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237656,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237656
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523864,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.027379871229943255,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.027379871229943255
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178277,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178277
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7192660550458716,
"acc_stderr": 0.01926605504587161,
"acc_norm": 0.7192660550458716,
"acc_norm_stderr": 0.01926605504587161
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145635
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114968,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114968
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650741,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650741
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280042,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280042
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890488,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890488
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.015075523238101072,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.015075523238101072
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.016598022120580418,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.016598022120580418
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380154,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573083,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573083
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534423,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534423
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.019997973035458333,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.019997973035458333
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.03106721126287247,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.03106721126287247
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.03265819588512699,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.03265819588512699
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.01686294168408838,
"mc2": 0.5197363921890529,
"mc2_stderr": 0.015737419947776412
}
}
```
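As a quick illustration, here is a minimal sketch of reading a results file shaped like the JSON above and macro-averaging the MMLU (hendrycksTest) subtask accuracies. The local file name is an assumption; the key layout follows the JSON shown above.
```python
import json

# Load a results file shaped like the JSON above (file name is an assumption).
with open("results.json") as f:
    results = json.load(f)

# Keep only the MMLU subtasks and macro-average their accuracies.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
macro_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, macro-averaged acc = {macro_acc:.4f}")
```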
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BangumiBase/engagekiss | 2023-10-01T16:14:09.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Engage Kiss
This is the image base of the bangumi Engage Kiss. We detected 16 characters and 1252 images in total. The full dataset is available [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noisy samples.** If you intend to train models on this dataset, we recommend performing the necessary preprocessing on the downloaded files to eliminate potential noisy samples (roughly 1% of the images); a download-and-extract sketch is shown below.
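As a minimal sketch (not part of the original tooling), one way to fetch and unpack a single character's archive is via `huggingface_hub`; the repo id and the per-character `dataset.zip` paths come from the preview table below, while the local directory name is an arbitrary assumption.
```python
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

# Download the archive for character #0; the "0/dataset.zip" path mirrors
# the download links in the preview table below.
archive = hf_hub_download(
    repo_id="BangumiBase/engagekiss",
    filename="0/dataset.zip",
    repo_type="dataset",
)

# Extract into a local folder (the folder name is an assumption), then
# inspect the images manually to drop the ~1% noisy samples.
out_dir = Path("engagekiss_char0")
out_dir.mkdir(exist_ok=True)
with zipfile.ZipFile(archive) as zf:
    zf.extractall(out_dir)
print(f"Extracted {len(list(out_dir.glob('*')))} files to {out_dir}")
```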
Here is a preview of the characters:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 176 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 166 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 64 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 34 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 324 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 57 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 30 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 85 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 44 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 15 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 24 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 14 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 80 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 28 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 10 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 101 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
movie13/Gemas | 2023-10-01T17:20:07.000Z | [
"region:us"
] | movie13 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_chargoddard__storytime-13b | 2023-10-01T15:29:51.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of chargoddard/storytime-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/storytime-13b](https://huggingface.co/chargoddard/storytime-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__storytime-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T15:28:27.861711](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__storytime-13b/blob/main/results_2023-10-01T15-28-27.861711.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5763370231419958,\n\
\ \"acc_stderr\": 0.03432066244201693,\n \"acc_norm\": 0.5800445296501794,\n\
\ \"acc_norm_stderr\": 0.034299045087695934,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.01683886288396583,\n \"mc2\": 0.5250015513015516,\n\
\ \"mc2_stderr\": 0.015881132202437784\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5938566552901023,\n \"acc_stderr\": 0.014351656690097862,\n\
\ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6472814180442144,\n\
\ \"acc_stderr\": 0.004768395354146807,\n \"acc_norm\": 0.839573790081657,\n\
\ \"acc_norm_stderr\": 0.0036625082723308984\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009794,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009794\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087764,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087764\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.024594975128920938,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.024594975128920938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6451612903225806,\n \"acc_stderr\": 0.02721888977330877,\n \"\
acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.02721888977330877\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"\
acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713546,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713546\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5435897435897435,\n \"acc_stderr\": 0.025254485424799605,\n\
\ \"acc_norm\": 0.5435897435897435,\n \"acc_norm_stderr\": 0.025254485424799605\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7688073394495413,\n \"acc_stderr\": 0.018075750241633142,\n \"\
acc_norm\": 0.7688073394495413,\n \"acc_norm_stderr\": 0.018075750241633142\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069432,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069432\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.01483620516733356,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.01483620516733356\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546672,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546672\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4893854748603352,\n\
\ \"acc_stderr\": 0.0167187329411921,\n \"acc_norm\": 0.4893854748603352,\n\
\ \"acc_norm_stderr\": 0.0167187329411921\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.02787074527829028,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.02787074527829028\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719978,\n\
\ \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719978\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n\
\ \"acc_stderr\": 0.012718456618701763,\n \"acc_norm\": 0.455019556714472,\n\
\ \"acc_norm_stderr\": 0.012718456618701763\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329383,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329383\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5866013071895425,\n \"acc_stderr\": 0.019922115682786682,\n \
\ \"acc_norm\": 0.5866013071895425,\n \"acc_norm_stderr\": 0.019922115682786682\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547728,\n\
\ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547728\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.01683886288396583,\n \"mc2\": 0.5250015513015516,\n\
\ \"mc2_stderr\": 0.015881132202437784\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/storytime-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-28-27.861711.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-28-27.861711.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-28-27.861711.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-28-27.861711.parquet'
- config_name: results
data_files:
- split: 2023_10_01T15_28_27.861711
path:
- results_2023-10-01T15-28-27.861711.parquet
- split: latest
path:
- results_2023-10-01T15-28-27.861711.parquet
---
# Dataset Card for Evaluation run of chargoddard/storytime-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/storytime-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/storytime-13b](https://huggingface.co/chargoddard/storytime-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__storytime-13b",
"harness_truthfulqa_mc_0",
split="train")
```
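For example, to pull the aggregated metrics or the per-example details of a single subtask from the latest run, you can target the configs and the "latest" split defined in the metadata above (a sketch; any other config name listed there works the same way):
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_chargoddard__storytime-13b"

# Aggregated metrics for the run (the "results" config defined above).
results = load_dataset(REPO, "results", split="latest")

# Per-example details for one MMLU subtask, latest run only.
details = load_dataset(REPO, "harness_hendrycksTest_world_religions_5", split="latest")

print(results)
print(details)
```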
## Latest results
These are the [latest results from run 2023-10-01T15:28:27.861711](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__storytime-13b/blob/main/results_2023-10-01T15-28-27.861711.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5763370231419958,
"acc_stderr": 0.03432066244201693,
"acc_norm": 0.5800445296501794,
"acc_norm_stderr": 0.034299045087695934,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.01683886288396583,
"mc2": 0.5250015513015516,
"mc2_stderr": 0.015881132202437784
},
"harness|arc:challenge|25": {
"acc": 0.5938566552901023,
"acc_stderr": 0.014351656690097862,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.6472814180442144,
"acc_stderr": 0.004768395354146807,
"acc_norm": 0.839573790081657,
"acc_norm_stderr": 0.0036625082723308984
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009794,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009794
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087764,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087764
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.024594975128920938,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.024594975128920938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.02721888977330877,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.02721888977330877
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713546,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713546
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5435897435897435,
"acc_stderr": 0.025254485424799605,
"acc_norm": 0.5435897435897435,
"acc_norm_stderr": 0.025254485424799605
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066475,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066475
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7688073394495413,
"acc_stderr": 0.018075750241633142,
"acc_norm": 0.7688073394495413,
"acc_norm_stderr": 0.018075750241633142
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069432,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069432
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.01483620516733356,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.01483620516733356
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.025574123786546672,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.025574123786546672
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4893854748603352,
"acc_stderr": 0.0167187329411921,
"acc_norm": 0.4893854748603352,
"acc_norm_stderr": 0.0167187329411921
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.02787074527829028,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.02787074527829028
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719978,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719978
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.455019556714472,
"acc_stderr": 0.012718456618701763,
"acc_norm": 0.455019556714472,
"acc_norm_stderr": 0.012718456618701763
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329383,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329383
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5866013071895425,
"acc_stderr": 0.019922115682786682,
"acc_norm": 0.5866013071895425,
"acc_norm_stderr": 0.019922115682786682
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547728,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547728
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.01683886288396583,
"mc2": 0.5250015513015516,
"mc2_stderr": 0.015881132202437784
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_khoantap__wizard-limarp | 2023-10-01T15:41:10.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of khoantap/wizard-limarp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [khoantap/wizard-limarp](https://huggingface.co/khoantap/wizard-limarp) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_khoantap__wizard-limarp\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T15:39:54.493965](https://huggingface.co/datasets/open-llm-leaderboard/details_khoantap__wizard-limarp/blob/main/results_2023-10-01T15-39-54.493965.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the \"results\" configuration and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5508016266489828,\n\
\ \"acc_stderr\": 0.03448632181800869,\n \"acc_norm\": 0.5547870513407215,\n\
\ \"acc_norm_stderr\": 0.03446689219489,\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150538,\n \"mc2\": 0.482777527677442,\n\
\ \"mc2_stderr\": 0.015184988472523642\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5452218430034129,\n \"acc_stderr\": 0.014551507060836357,\n\
\ \"acc_norm\": 0.5861774744027304,\n \"acc_norm_stderr\": 0.014392730009221005\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6244771957777335,\n\
\ \"acc_stderr\": 0.004832679188788789,\n \"acc_norm\": 0.8186616211909978,\n\
\ \"acc_norm_stderr\": 0.003845108476401298\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.030052580579557845,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.030052580579557845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171451,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171451\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3544973544973545,\n \"acc_stderr\": 0.024636830602842,\n \"acc_norm\"\
: 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602842\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\
\ \"acc_stderr\": 0.027327548447957532,\n \"acc_norm\": 0.6387096774193548,\n\
\ \"acc_norm_stderr\": 0.027327548447957532\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391244,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391244\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.029252823291803638,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.029252823291803638\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.728440366972477,\n \"acc_stderr\": 0.01906909836319143,\n \"acc_norm\"\
: 0.728440366972477,\n \"acc_norm_stderr\": 0.01906909836319143\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.03324708911809117,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.03324708911809117\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n \
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\"\
: 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598028,\n \"\
acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598028\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285712,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285712\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.04656147110012351,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.04656147110012351\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.02559819368665224,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.02559819368665224\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7279693486590039,\n\
\ \"acc_stderr\": 0.015913367447500503,\n \"acc_norm\": 0.7279693486590039,\n\
\ \"acc_norm_stderr\": 0.015913367447500503\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5924855491329479,\n \"acc_stderr\": 0.026454578146931505,\n\
\ \"acc_norm\": 0.5924855491329479,\n \"acc_norm_stderr\": 0.026454578146931505\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30726256983240224,\n\
\ \"acc_stderr\": 0.015430158846469609,\n \"acc_norm\": 0.30726256983240224,\n\
\ \"acc_norm_stderr\": 0.015430158846469609\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891776,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891776\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.02777091853142784,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.02777091853142784\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5771604938271605,\n \"acc_stderr\": 0.027487472980871595,\n\
\ \"acc_norm\": 0.5771604938271605,\n \"acc_norm_stderr\": 0.027487472980871595\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41460234680573665,\n\
\ \"acc_stderr\": 0.012582597058908284,\n \"acc_norm\": 0.41460234680573665,\n\
\ \"acc_norm_stderr\": 0.012582597058908284\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329387,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329387\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02017548876548404,\n \
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02017548876548404\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540603,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150538,\n \"mc2\": 0.482777527677442,\n\
\ \"mc2_stderr\": 0.015184988472523642\n }\n}\n```"
repo_url: https://huggingface.co/khoantap/wizard-limarp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-39-54.493965.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-39-54.493965.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-39-54.493965.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-39-54.493965.parquet'
- config_name: results
data_files:
- split: 2023_10_01T15_39_54.493965
path:
- results_2023-10-01T15-39-54.493965.parquet
- split: latest
path:
- results_2023-10-01T15-39-54.493965.parquet
---
# Dataset Card for Evaluation run of khoantap/wizard-limarp
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/khoantap/wizard-limarp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [khoantap/wizard-limarp](https://huggingface.co/khoantap/wizard-limarp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_khoantap__wizard-limarp",
"harness_truthfulqa_mc_0",
split="train")
```
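Each run is also exposed under its timestamped split name, so a specific evaluation snapshot can be pinned rather than tracking the moving "train"/"latest" aliases; a minimal sketch using the split name listed in the configurations above:

```python
from datasets import load_dataset

# Pin the exact run by its timestamped split instead of the moving "train" alias.
data = load_dataset("open-llm-leaderboard/details_khoantap__wizard-limarp",
                    "harness_truthfulqa_mc_0",
                    split="2023_10_01T15_39_54.493965")
```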
## Latest results
These are the [latest results from run 2023-10-01T15:39:54.493965](https://huggingface.co/datasets/open-llm-leaderboard/details_khoantap__wizard-limarp/blob/main/results_2023-10-01T15-39-54.493965.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the "results" configuration and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5508016266489828,
"acc_stderr": 0.03448632181800869,
"acc_norm": 0.5547870513407215,
"acc_norm_stderr": 0.03446689219489,
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150538,
"mc2": 0.482777527677442,
"mc2_stderr": 0.015184988472523642
},
"harness|arc:challenge|25": {
"acc": 0.5452218430034129,
"acc_stderr": 0.014551507060836357,
"acc_norm": 0.5861774744027304,
"acc_norm_stderr": 0.014392730009221005
},
"harness|hellaswag|10": {
"acc": 0.6244771957777335,
"acc_stderr": 0.004832679188788789,
"acc_norm": 0.8186616211909978,
"acc_norm_stderr": 0.003845108476401298
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.030052580579557845,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.030052580579557845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171451,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171451
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3544973544973545,
"acc_stderr": 0.024636830602842,
"acc_norm": 0.3544973544973545,
"acc_norm_stderr": 0.024636830602842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.027327548447957532,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.027327548447957532
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391244,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391244
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.029252823291803638,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.029252823291803638
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.728440366972477,
"acc_stderr": 0.01906909836319143,
"acc_norm": 0.728440366972477,
"acc_norm_stderr": 0.01906909836319143
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.03324708911809117,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.03324708911809117
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598028,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598028
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285712,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285712
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.04656147110012351,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.04656147110012351
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.02559819368665224,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.02559819368665224
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7279693486590039,
"acc_stderr": 0.015913367447500503,
"acc_norm": 0.7279693486590039,
"acc_norm_stderr": 0.015913367447500503
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5924855491329479,
"acc_stderr": 0.026454578146931505,
"acc_norm": 0.5924855491329479,
"acc_norm_stderr": 0.026454578146931505
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30726256983240224,
"acc_stderr": 0.015430158846469609,
"acc_norm": 0.30726256983240224,
"acc_norm_stderr": 0.015430158846469609
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.028036092273891776,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.028036092273891776
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.02777091853142784,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.02777091853142784
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5771604938271605,
"acc_stderr": 0.027487472980871595,
"acc_norm": 0.5771604938271605,
"acc_norm_stderr": 0.027487472980871595
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.02942799403941999,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.02942799403941999
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41460234680573665,
"acc_stderr": 0.012582597058908284,
"acc_norm": 0.41460234680573665,
"acc_norm_stderr": 0.012582597058908284
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329387,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329387
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.02017548876548404,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.02017548876548404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.030116426296540603,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.030116426296540603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150538,
"mc2": 0.482777527677442,
"mc2_stderr": 0.015184988472523642
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_AlekseyKorshuk__vic15-exp-syn-fight-cp3838 | 2023-10-01T15:45:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of AlekseyKorshuk/vic15-exp-syn-fight-cp3838
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AlekseyKorshuk/vic15-exp-syn-fight-cp3838](https://huggingface.co/AlekseyKorshuk/vic15-exp-syn-fight-cp3838)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AlekseyKorshuk__vic15-exp-syn-fight-cp3838\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T15:44:18.303081](https://huggingface.co/datasets/open-llm-leaderboard/details_AlekseyKorshuk__vic15-exp-syn-fight-cp3838/blob/main/results_2023-10-01T15-44-18.303081.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks; you can find each one in the results and the \"latest\"\
\ split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5035250204495572,\n\
\ \"acc_stderr\": 0.03500483713757127,\n \"acc_norm\": 0.5069246512960048,\n\
\ \"acc_norm_stderr\": 0.034993617027730566,\n \"mc1\": 0.3329253365973072,\n\
\ \"mc1_stderr\": 0.016497402382012052,\n \"mc2\": 0.49613197888405214,\n\
\ \"mc2_stderr\": 0.015701759057597957\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49658703071672355,\n \"acc_stderr\": 0.014611050403244081,\n\
\ \"acc_norm\": 0.5179180887372014,\n \"acc_norm_stderr\": 0.014602005585490978\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5786695877315275,\n\
\ \"acc_stderr\": 0.004927631806477561,\n \"acc_norm\": 0.7579167496514638,\n\
\ \"acc_norm_stderr\": 0.0042746901436291375\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5207547169811321,\n \"acc_stderr\": 0.030746349975723456,\n\
\ \"acc_norm\": 0.5207547169811321,\n \"acc_norm_stderr\": 0.030746349975723456\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.038073017265045105,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.038073017265045105\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5451612903225806,\n\
\ \"acc_stderr\": 0.028327743091561074,\n \"acc_norm\": 0.5451612903225806,\n\
\ \"acc_norm_stderr\": 0.028327743091561074\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.033764582465095665,\n\
\ \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.033764582465095665\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.037694303145125674,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.037694303145125674\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6161616161616161,\n \"acc_stderr\": 0.03464881675016339,\n \"\
acc_norm\": 0.6161616161616161,\n \"acc_norm_stderr\": 0.03464881675016339\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.694300518134715,\n \"acc_stderr\": 0.033248379397581594,\n\
\ \"acc_norm\": 0.694300518134715,\n \"acc_norm_stderr\": 0.033248379397581594\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.025285585990017838,\n\
\ \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.025285585990017838\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.025787874220959316,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.025787874220959316\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4495798319327731,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.4495798319327731,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804723,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804723\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6990825688073394,\n \"acc_stderr\": 0.019664751366802114,\n \"\
acc_norm\": 0.6990825688073394,\n \"acc_norm_stderr\": 0.019664751366802114\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n \"acc_norm\"\
: 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \"\
acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\
\ \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n\
\ \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.03915857291436971,\n\
\ \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.03915857291436971\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.04721188506097174,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.04721188506097174\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.027236013946196694,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.027236013946196694\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6845466155810983,\n\
\ \"acc_stderr\": 0.016617501738763394,\n \"acc_norm\": 0.6845466155810983,\n\
\ \"acc_norm_stderr\": 0.016617501738763394\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5578034682080925,\n \"acc_stderr\": 0.026738603643807403,\n\
\ \"acc_norm\": 0.5578034682080925,\n \"acc_norm_stderr\": 0.026738603643807403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n\
\ \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n\
\ \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.028472938478033526,\n\
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.028472938478033526\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946205,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946205\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5432098765432098,\n \"acc_stderr\": 0.027716661650194038,\n\
\ \"acc_norm\": 0.5432098765432098,\n \"acc_norm_stderr\": 0.027716661650194038\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3670143415906128,\n\
\ \"acc_stderr\": 0.012310264244842125,\n \"acc_norm\": 0.3670143415906128,\n\
\ \"acc_norm_stderr\": 0.012310264244842125\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.47549019607843135,\n \"acc_stderr\": 0.020203517280261436,\n \
\ \"acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.020203517280261436\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.681592039800995,\n\
\ \"acc_stderr\": 0.032941184790540944,\n \"acc_norm\": 0.681592039800995,\n\
\ \"acc_norm_stderr\": 0.032941184790540944\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3329253365973072,\n\
\ \"mc1_stderr\": 0.016497402382012052,\n \"mc2\": 0.49613197888405214,\n\
\ \"mc2_stderr\": 0.015701759057597957\n }\n}\n```"
repo_url: https://huggingface.co/AlekseyKorshuk/vic15-exp-syn-fight-cp3838
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-44-18.303081.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T15-44-18.303081.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-44-18.303081.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T15-44-18.303081.parquet'
- config_name: results
data_files:
- split: 2023_10_01T15_44_18.303081
path:
- results_2023-10-01T15-44-18.303081.parquet
- split: latest
path:
- results_2023-10-01T15-44-18.303081.parquet
---
# Dataset Card for Evaluation run of AlekseyKorshuk/vic15-exp-syn-fight-cp3838
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AlekseyKorshuk/vic15-exp-syn-fight-cp3838
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AlekseyKorshuk/vic15-exp-syn-fight-cp3838](https://huggingface.co/AlekseyKorshuk/vic15-exp-syn-fight-cp3838) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AlekseyKorshuk__vic15-exp-syn-fight-cp3838",
"harness_truthfulqa_mc_0",
split="train")
```
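To get the aggregated scores instead of per-task details, you can load the "results" configuration; a minimal sketch (the "results" config name and the "latest" split alias are taken from the configs listed in the YAML above):
```python
from datasets import load_dataset

# Aggregated metrics for this model; the "latest" split always points
# to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_AlekseyKorshuk__vic15-exp-syn-fight-cp3838",
    "results",
    split="latest",
)
```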
## Latest results
These are the [latest results from run 2023-10-01T15:44:18.303081](https://huggingface.co/datasets/open-llm-leaderboard/details_AlekseyKorshuk__vic15-exp-syn-fight-cp3838/blob/main/results_2023-10-01T15-44-18.303081.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5035250204495572,
"acc_stderr": 0.03500483713757127,
"acc_norm": 0.5069246512960048,
"acc_norm_stderr": 0.034993617027730566,
"mc1": 0.3329253365973072,
"mc1_stderr": 0.016497402382012052,
"mc2": 0.49613197888405214,
"mc2_stderr": 0.015701759057597957
},
"harness|arc:challenge|25": {
"acc": 0.49658703071672355,
"acc_stderr": 0.014611050403244081,
"acc_norm": 0.5179180887372014,
"acc_norm_stderr": 0.014602005585490978
},
"harness|hellaswag|10": {
"acc": 0.5786695877315275,
"acc_stderr": 0.004927631806477561,
"acc_norm": 0.7579167496514638,
"acc_norm_stderr": 0.0042746901436291375
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5207547169811321,
"acc_stderr": 0.030746349975723456,
"acc_norm": 0.5207547169811321,
"acc_norm_stderr": 0.030746349975723456
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.038073017265045105,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.038073017265045105
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159393,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159393
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5451612903225806,
"acc_stderr": 0.028327743091561074,
"acc_norm": 0.5451612903225806,
"acc_norm_stderr": 0.028327743091561074
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.033764582465095665,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.033764582465095665
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.037694303145125674,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.037694303145125674
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.03464881675016339,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.03464881675016339
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.694300518134715,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.694300518134715,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4641025641025641,
"acc_stderr": 0.025285585990017838,
"acc_norm": 0.4641025641025641,
"acc_norm_stderr": 0.025285585990017838
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.025787874220959316,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.025787874220959316
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4495798319327731,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.4495798319327731,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804723,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804723
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6990825688073394,
"acc_stderr": 0.019664751366802114,
"acc_norm": 0.6990825688073394,
"acc_norm_stderr": 0.019664751366802114
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.03228210387037892,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.03228210387037892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5398773006134969,
"acc_stderr": 0.03915857291436971,
"acc_norm": 0.5398773006134969,
"acc_norm_stderr": 0.03915857291436971
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.04721188506097174,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.04721188506097174
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196694,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196694
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.56,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6845466155810983,
"acc_stderr": 0.016617501738763394,
"acc_norm": 0.6845466155810983,
"acc_norm_stderr": 0.016617501738763394
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5578034682080925,
"acc_stderr": 0.026738603643807403,
"acc_norm": 0.5578034682080925,
"acc_norm_stderr": 0.026738603643807403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.23575418994413408,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.028472938478033526,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.028472938478033526
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946205,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946205
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5432098765432098,
"acc_stderr": 0.027716661650194038,
"acc_norm": 0.5432098765432098,
"acc_norm_stderr": 0.027716661650194038
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3670143415906128,
"acc_stderr": 0.012310264244842125,
"acc_norm": 0.3670143415906128,
"acc_norm_stderr": 0.012310264244842125
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.020203517280261436,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.020203517280261436
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.681592039800995,
"acc_stderr": 0.032941184790540944,
"acc_norm": 0.681592039800995,
"acc_norm_stderr": 0.032941184790540944
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3329253365973072,
"mc1_stderr": 0.016497402382012052,
"mc2": 0.49613197888405214,
"mc2_stderr": 0.015701759057597957
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
pphuc25/data-gpt4 | 2023-10-01T15:47:44.000Z | [
"region:us"
] | pphuc25 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: result
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 224096
num_examples: 100
download_size: 119152
dataset_size: 224096
---
# Dataset Card for "data-gpt4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
1aurent/COMPTECH2022 | 2023-10-01T15:53:30.000Z | [
"task_categories:image-classification",
"size_categories:1K<n<10K",
"license:cc-by-4.0",
"region:us"
] | 1aurent | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': forged
'1': genuine
splits:
- name: train
num_bytes: 10544187.713
num_examples: 6171
download_size: 2251347922
dataset_size: 10544187.713
license: cc-by-4.0
task_categories:
- image-classification
size_categories:
- 1K<n<10K
---
# COMPTECH2022 "WhoSigned?"
https://www.kaggle.com/datasets/tienen/handwritten-signature-verification
## Description
The dataset contains over 5000 handwritten signatures, with corresponding images and crops for genuine and forged signatures.
Each image contains about 10 handwritten signatures from the same user id.
The image is then cropped with the help of a segmentation neural network.
Every crop contains one handwritten signature.
You can derive the user id from the image filename.
Created with COMPTECH2022 support by Toloka.ai
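A minimal loading sketch (the default configuration and `train` split come from the YAML above, as do the `forged`/`genuine` class names):
```python
from datasets import load_dataset

# Load the signature crops; each example is an image plus a binary label
# ("forged" or "genuine", per the class_label mapping above).
ds = load_dataset("1aurent/COMPTECH2022", split="train")
example = ds[0]
print(example["image"].size)  # PIL image of one signature crop
print(ds.features["label"].int2str(example["label"]))  # "forged" or "genuine"
```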
|
pphuc25/data-gpt3 | 2023-10-01T15:49:24.000Z | [
"region:us"
] | pphuc25 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: result
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 260620
num_examples: 100
download_size: 84562
dataset_size: 260620
---
# Dataset Card for "data-gpt3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sitloboi2012/CMDS_Multimodal_Document | 2023-10-01T16:03:24.000Z | [
"task_categories:image-classification",
"task_categories:text-classification",
"task_categories:image-to-text",
"size_categories:n<1K",
"language:bg",
"license:apache-2.0",
"DocumentAI",
"ImageClassification",
"SequenceClassification",
"region:us"
] | sitloboi2012 | null | null | null | 0 | 0 | ---
license: apache-2.0
task_categories:
- image-classification
- text-classification
- image-to-text
language:
- bg
tags:
- DocumentAI
- ImageClassification
- SequenceClassification
pretty_name: CMDS Document Images Dataset
size_categories:
- n<1K
---
# Dataset Card for Cyrillic Multimodal Document (CMDS)
This dataset consists of 3789 pairs of images and text across 31 categories, downloaded from the Bulgarian Ministry of Finance.
### Dataset Summary
CMDS is a multimodal document collection of 3789 image-text pairs of Bulgarian Ministry of Finance documents, labeled across 31 categories.
### Supported Tasks and Leaderboards
Use this dataset for downstream tasks like Document Classification, Image Classification, or Text Classification (Sequence Classification). It is suitable for multimodal models like the LayoutLM family, Donut, etc.
### Languages
Bulgarian
### Data Fields
- __text__ (bytes): the text appearing in the document
- __filename__ (str): the name of the file
- __image__ (PIL.Image): the image of the document
- __label__ (str): the label of the document. There are 31 different labels
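A minimal loading sketch (note: the card does not declare its splits, so the `train` split name below is an assumption):
```python
from datasets import load_dataset

# Load the CMDS image-text pairs; the "train" split name is an assumption,
# since the card does not list its splits. Fields follow the Data Fields
# section above.
ds = load_dataset("sitloboi2012/CMDS_Multimodal_Document", split="train")
sample = ds[0]
print(sample["filename"], sample["label"])  # file name and one of the 31 labels
sample["image"]  # PIL.Image of the scanned document
```
|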
semeru/Code-code-galeras-prompting-3k-control | 2023-10-01T16:10:17.000Z | [
"region:us"
] | semeru | null | null | null | 0 | 0 | Entry not found |
semeru/Code-code-galeras-prompting-3k-treatment-1 | 2023-10-01T16:12:40.000Z | [
"region:us"
] | semeru | null | null | null | 0 | 0 | Entry not found |
semeru/Code-code-galeras-prompting-3k-treatment-2 | 2023-10-01T16:14:18.000Z | [
"region:us"
] | semeru | null | null | null | 0 | 0 | Entry not found |
gjoy/files_for_GJ | 2023-10-01T16:16:11.000Z | [
"region:us"
] | gjoy | null | null | null | 0 | 0 | Entry not found |
jjv360/test-dataset | 2023-10-01T16:16:25.000Z | [
"region:us"
] | jjv360 | null | null | null | 0 | 0 | Entry not found |
LAHASH/weatherandnews | 2023-10-01T16:16:49.000Z | [
"license:unknown",
"region:us"
] | LAHASH | null | null | null | 0 | 0 | ---
license: unknown
---
|
shawarmas/yes | 2023-10-01T16:31:45.000Z | [
"region:us"
] | shawarmas | null | null | null | 0 | 0 | Entry not found |
HK83/two-people | 2023-10-01T16:42:28.000Z | [
"region:us"
] | HK83 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_NoIdeaLand__test-4k-fn | 2023-10-01T16:33:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NoIdeaLand/test-4k-fn
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NoIdeaLand/test-4k-fn](https://huggingface.co/NoIdeaLand/test-4k-fn) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NoIdeaLand__test-4k-fn\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T16:31:47.992543](https://huggingface.co/datasets/open-llm-leaderboard/details_NoIdeaLand__test-4k-fn/blob/main/results_2023-10-01T16-31-47.992543.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2795859427889157,\n\
\ \"acc_stderr\": 0.03244654146727709,\n \"acc_norm\": 0.283431508310814,\n\
\ \"acc_norm_stderr\": 0.032446107426975616,\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080512,\n \"mc2\": 0.38860179255046867,\n\
\ \"mc2_stderr\": 0.014093255696402213\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35665529010238906,\n \"acc_stderr\": 0.01399805690262019,\n\
\ \"acc_norm\": 0.3993174061433447,\n \"acc_norm_stderr\": 0.014312094557946704\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4971121290579566,\n\
\ \"acc_stderr\": 0.004989698183207823,\n \"acc_norm\": 0.6813383788090022,\n\
\ \"acc_norm_stderr\": 0.004650052150094427\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.29056603773584905,\n \"acc_stderr\": 0.027943219989337145,\n\
\ \"acc_norm\": 0.29056603773584905,\n \"acc_norm_stderr\": 0.027943219989337145\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3021276595744681,\n \"acc_stderr\": 0.030017554471880557,\n\
\ \"acc_norm\": 0.3021276595744681,\n \"acc_norm_stderr\": 0.030017554471880557\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.038552896163789485,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.038552896163789485\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948365,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948365\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2019704433497537,\n \"acc_stderr\": 0.02824735012218027,\n\
\ \"acc_norm\": 0.2019704433497537,\n \"acc_norm_stderr\": 0.02824735012218027\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23232323232323232,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.23232323232323232,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.03435696168361355,\n\
\ \"acc_norm\": 0.3471502590673575,\n \"acc_norm_stderr\": 0.03435696168361355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.023454674889404288,\n\
\ \"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.023454674889404288\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25630252100840334,\n \"acc_stderr\": 0.02835962087053395,\n\
\ \"acc_norm\": 0.25630252100840334,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"\
acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012397,\n \"\
acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012397\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350194,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350194\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3037974683544304,\n \"acc_stderr\": 0.029936696387138605,\n \
\ \"acc_norm\": 0.3037974683544304,\n \"acc_norm_stderr\": 0.029936696387138605\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3542600896860987,\n\
\ \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.3542600896860987,\n\
\ \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.0351238528370505,\n\
\ \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.0351238528370505\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3803418803418803,\n\
\ \"acc_stderr\": 0.03180425204384099,\n \"acc_norm\": 0.3803418803418803,\n\
\ \"acc_norm_stderr\": 0.03180425204384099\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24521072796934865,\n\
\ \"acc_stderr\": 0.015384352284543936,\n \"acc_norm\": 0.24521072796934865,\n\
\ \"acc_norm_stderr\": 0.015384352284543936\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654555,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.02678745311190654,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.02678745311190654\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2508038585209003,\n\
\ \"acc_stderr\": 0.024619771956697165,\n \"acc_norm\": 0.2508038585209003,\n\
\ \"acc_norm_stderr\": 0.024619771956697165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2777053455019557,\n\
\ \"acc_stderr\": 0.011438741422769575,\n \"acc_norm\": 0.2777053455019557,\n\
\ \"acc_norm_stderr\": 0.011438741422769575\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1948529411764706,\n \"acc_stderr\": 0.024060599423487428,\n\
\ \"acc_norm\": 0.1948529411764706,\n \"acc_norm_stderr\": 0.024060599423487428\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.28104575163398693,\n \"acc_stderr\": 0.018185218954318075,\n \
\ \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.018185218954318075\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n\
\ \"acc_stderr\": 0.03529486801511114,\n \"acc_norm\": 0.2891566265060241,\n\
\ \"acc_norm_stderr\": 0.03529486801511114\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080512,\n \"mc2\": 0.38860179255046867,\n\
\ \"mc2_stderr\": 0.014093255696402213\n }\n}\n```"
repo_url: https://huggingface.co/NoIdeaLand/test-4k-fn
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|arc:challenge|25_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hellaswag|10_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T16-31-47.992543.parquet'
- config_name: results
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- results_2023-10-01T16-31-47.992543.parquet
- split: latest
path:
- results_2023-10-01T16-31-47.992543.parquet
---
# Dataset Card for Evaluation run of NoIdeaLand/test-4k-fn
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NoIdeaLand/test-4k-fn
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NoIdeaLand/test-4k-fn](https://huggingface.co/NoIdeaLand/test-4k-fn) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NoIdeaLand__test-4k-fn",
"harness_truthfulqa_mc_0",
split="train")
```
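The aggregated scores live in the "results" configuration; a minimal sketch of loading them the same way (the config and split names are taken from the YAML above):
```python
from datasets import load_dataset

# load the aggregated metrics of the most recent run
results = load_dataset("open-llm-leaderboard/details_NoIdeaLand__test-4k-fn",
	"results",
	split="latest")
print(results[0])  # inspect the first row of aggregated scores
```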
## Latest results
These are the [latest results from run 2023-10-01T16:31:47.992543](https://huggingface.co/datasets/open-llm-leaderboard/details_NoIdeaLand__test-4k-fn/blob/main/results_2023-10-01T16-31-47.992543.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2795859427889157,
"acc_stderr": 0.03244654146727709,
"acc_norm": 0.283431508310814,
"acc_norm_stderr": 0.032446107426975616,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080512,
"mc2": 0.38860179255046867,
"mc2_stderr": 0.014093255696402213
},
"harness|arc:challenge|25": {
"acc": 0.35665529010238906,
"acc_stderr": 0.01399805690262019,
"acc_norm": 0.3993174061433447,
"acc_norm_stderr": 0.014312094557946704
},
"harness|hellaswag|10": {
"acc": 0.4971121290579566,
"acc_stderr": 0.004989698183207823,
"acc_norm": 0.6813383788090022,
"acc_norm_stderr": 0.004650052150094427
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.29056603773584905,
"acc_stderr": 0.027943219989337145,
"acc_norm": 0.29056603773584905,
"acc_norm_stderr": 0.027943219989337145
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3021276595744681,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.3021276595744681,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.038552896163789485,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.038552896163789485
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948365,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948365
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2019704433497537,
"acc_stderr": 0.02824735012218027,
"acc_norm": 0.2019704433497537,
"acc_norm_stderr": 0.02824735012218027
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23232323232323232,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.23232323232323232,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3471502590673575,
"acc_stderr": 0.03435696168361355,
"acc_norm": 0.3471502590673575,
"acc_norm_stderr": 0.03435696168361355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31025641025641026,
"acc_stderr": 0.023454674889404288,
"acc_norm": 0.31025641025641026,
"acc_norm_stderr": 0.023454674889404288
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25630252100840334,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.25630252100840334,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696525,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.030225226160012397,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.030225226160012397
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350194,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350194
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3037974683544304,
"acc_stderr": 0.029936696387138605,
"acc_norm": 0.3037974683544304,
"acc_norm_stderr": 0.029936696387138605
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3542600896860987,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.3542600896860987,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.0351238528370505,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.0351238528370505
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3803418803418803,
"acc_stderr": 0.03180425204384099,
"acc_norm": 0.3803418803418803,
"acc_norm_stderr": 0.03180425204384099
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24521072796934865,
"acc_stderr": 0.015384352284543936,
"acc_norm": 0.24521072796934865,
"acc_norm_stderr": 0.015384352284543936
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654555,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.02678745311190654,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.02678745311190654
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2508038585209003,
"acc_stderr": 0.024619771956697165,
"acc_norm": 0.2508038585209003,
"acc_norm_stderr": 0.024619771956697165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902013,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902013
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2777053455019557,
"acc_stderr": 0.011438741422769575,
"acc_norm": 0.2777053455019557,
"acc_norm_stderr": 0.011438741422769575
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1948529411764706,
"acc_stderr": 0.024060599423487428,
"acc_norm": 0.1948529411764706,
"acc_norm_stderr": 0.024060599423487428
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.018185218954318075,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.018185218954318075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2530612244897959,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.2530612244897959,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511114,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511114
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080512,
"mc2": 0.38860179255046867,
"mc2_stderr": 0.014093255696402213
}
}
```
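If you prefer the raw JSON file over `load_dataset`, it can be fetched with `hf_hub_download` from `huggingface_hub`; a sketch, assuming only the filename linked above:
```python
import json
from huggingface_hub import hf_hub_download

# download the raw results file from the dataset repository
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_NoIdeaLand__test-4k-fn",
    filename="results_2023-10-01T16-31-47.992543.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(results["all"]["acc"])  # the aggregated accuracy shown above
```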
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BangumiBase/orenoimoutogakonnanikawaiiwakeganai | 2023-10-01T18:44:56.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Ore No Imouto Ga Konna Ni Kawaii Wake Ga Nai
This is the image base of the bangumi Ore no Imouto ga Konna ni Kawaii Wake ga Nai. We detected 40 characters and 4925 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noise.** If you intend to train models on this dataset, we recommend performing the necessary preprocessing on the downloaded files to eliminate potentially noisy samples (approximately 1% probability).
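As a minimal sketch of one way to fetch and unpack the full archive (assuming standard `huggingface_hub` usage; `all.zip` is the filename linked above, and the output directory is hypothetical):
```python
import zipfile
from huggingface_hub import hf_hub_download

# fetch the full image archive from this dataset repository
archive = hf_hub_download(
    repo_id="BangumiBase/orenoimoutogakonnanikawaiiwakeganai",
    filename="all.zip",
    repo_type="dataset",
)
with zipfile.ZipFile(archive) as zf:
    zf.extractall("oreimo_image_base")  # hypothetical output directory
```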
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1496 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 25 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 305 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 41 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 42 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 29 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 121 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 35 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 15 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 54 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 36 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 192 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 60 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 117 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 15 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 19 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 51 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 38 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 85 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 17 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 19 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 10 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 972 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 46 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 496 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 37 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 35 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 8 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 24 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 7 | [Download](29/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 30 | 8 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 18 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 50 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 11 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 51 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 52 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 14 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 7 | [Download](37/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 38 | 20 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 247 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
multi-train/emb_train_v1_2 | 2023-10-01T16:45:53.000Z | [
"region:us"
] | multi-train | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
sequence: string
- name: task
dtype: string
- name: instruction
struct:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
splits:
- name: train
num_bytes: 1271098814
num_examples: 600000
download_size: 446126873
dataset_size: 1271098814
---
# Dataset Card for "emb_train_v1_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kotiralla/mj8 | 2023-10-01T16:46:47.000Z | [
"license:lppl-1.3c",
"region:us"
] | Kotiralla | null | null | null | 0 | 0 | ---
license: lppl-1.3c
---
|
multi-train/emb_train_v1_3 | 2023-10-01T16:49:31.000Z | [
"region:us"
] | multi-train | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: query
dtype: string
- name: pos
sequence: string
- name: neg
sequence: string
- name: task
dtype: string
- name: instruction
struct:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
splits:
- name: train
num_bytes: 148713747
num_examples: 175000
download_size: 84421055
dataset_size: 148713747
---
# Dataset Card for "emb_train_v1_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rutson/TomorrowXTogether | 2023-10-04T14:04:50.000Z | [
"region:us"
] | Rutson | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_anhnv125__llama-op-v4 | 2023-10-01T16:51:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of anhnv125/llama-op-v4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [anhnv125/llama-op-v4](https://huggingface.co/anhnv125/llama-op-v4) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_anhnv125__llama-op-v4\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T16:49:40.614622](https://huggingface.co/datasets/open-llm-leaderboard/details_anhnv125__llama-op-v4/blob/main/results_2023-10-01T16-49-40.614622.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5704304835934253,\n\
\ \"acc_stderr\": 0.03447222552265844,\n \"acc_norm\": 0.5746357491117303,\n\
\ \"acc_norm_stderr\": 0.03445324424541304,\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.015983595101811392,\n \"mc2\": 0.4272385640595212,\n\
\ \"mc2_stderr\": 0.015005651426028057\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5648464163822525,\n \"acc_stderr\": 0.014487986197186045,\n\
\ \"acc_norm\": 0.6151877133105802,\n \"acc_norm_stderr\": 0.014218371065251098\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5943039235212109,\n\
\ \"acc_stderr\": 0.004900227226433392,\n \"acc_norm\": 0.7920732921728739,\n\
\ \"acc_norm_stderr\": 0.004049947000889829\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.0403356566784832,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.0403356566784832\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731837,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731837\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.024594975128920935,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.024594975128920935\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6612903225806451,\n \"acc_stderr\": 0.026923446059302844,\n \"\
acc_norm\": 0.6612903225806451,\n \"acc_norm_stderr\": 0.026923446059302844\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438803,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438803\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915333,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915333\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.024838811988033165,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.024838811988033165\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n\
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"\
acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7779816513761468,\n \"acc_stderr\": 0.01781884956479664,\n \"\
acc_norm\": 0.7779816513761468,\n \"acc_norm_stderr\": 0.01781884956479664\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n\
\ \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644559,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644559\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n\
\ \"acc_stderr\": 0.0274210072953929,\n \"acc_norm\": 0.7735042735042735,\n\
\ \"acc_norm_stderr\": 0.0274210072953929\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7420178799489144,\n\
\ \"acc_stderr\": 0.01564583018834895,\n \"acc_norm\": 0.7420178799489144,\n\
\ \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48379888268156424,\n\
\ \"acc_stderr\": 0.01671372072950102,\n \"acc_norm\": 0.48379888268156424,\n\
\ \"acc_norm_stderr\": 0.01671372072950102\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283693,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.027368078243971646,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.027368078243971646\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.02712511551316685,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.02712511551316685\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223684,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223684\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n\
\ \"acc_stderr\": 0.012671902782567648,\n \"acc_norm\": 0.4380704041720991,\n\
\ \"acc_norm_stderr\": 0.012671902782567648\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.030161911930767105,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.030161911930767105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5604575163398693,\n \"acc_stderr\": 0.020079420408087918,\n \
\ \"acc_norm\": 0.5604575163398693,\n \"acc_norm_stderr\": 0.020079420408087918\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547728,\n\
\ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547728\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.030965903123573037,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.030965903123573037\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.015983595101811392,\n \"mc2\": 0.4272385640595212,\n\
\ \"mc2_stderr\": 0.015005651426028057\n }\n}\n```"
repo_url: https://huggingface.co/anhnv125/llama-op-v4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|arc:challenge|25_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hellaswag|10_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T16-49-40.614622.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T16-49-40.614622.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T16-49-40.614622.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T16-49-40.614622.parquet'
- config_name: results
data_files:
- split: 2023_10_01T16_49_40.614622
path:
- results_2023-10-01T16-49-40.614622.parquet
- split: latest
path:
- results_2023-10-01T16-49-40.614622.parquet
---
# Dataset Card for Evaluation run of anhnv125/llama-op-v4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/anhnv125/llama-op-v4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [anhnv125/llama-op-v4](https://huggingface.co/anhnv125/llama-op-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_anhnv125__llama-op-v4",
"harness_truthfulqa_mc_0",
	split="latest")
```
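The aggregated metrics can be loaded the same way. A minimal sketch — the "results" configuration exposes the same timestamped and "latest" splits as the per-task configurations above, though the exact row layout is not documented here, so inspect the output:
```python
from datasets import load_dataset

# Load the aggregated metrics; the "latest" split points at the most
# recent evaluation run (see the "results" config in the YAML above).
results = load_dataset(
    "open-llm-leaderboard/details_anhnv125__llama-op-v4",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated scores for the run
```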
## Latest results
These are the [latest results from run 2023-10-01T16:49:40.614622](https://huggingface.co/datasets/open-llm-leaderboard/details_anhnv125__llama-op-v4/blob/main/results_2023-10-01T16-49-40.614622.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split):
```json
{
"all": {
"acc": 0.5704304835934253,
"acc_stderr": 0.03447222552265844,
"acc_norm": 0.5746357491117303,
"acc_norm_stderr": 0.03445324424541304,
"mc1": 0.2962056303549572,
"mc1_stderr": 0.015983595101811392,
"mc2": 0.4272385640595212,
"mc2_stderr": 0.015005651426028057
},
"harness|arc:challenge|25": {
"acc": 0.5648464163822525,
"acc_stderr": 0.014487986197186045,
"acc_norm": 0.6151877133105802,
"acc_norm_stderr": 0.014218371065251098
},
"harness|hellaswag|10": {
"acc": 0.5943039235212109,
"acc_stderr": 0.004900227226433392,
"acc_norm": 0.7920732921728739,
"acc_norm_stderr": 0.004049947000889829
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731837,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.024594975128920935,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.024594975128920935
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302844,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302844
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438803,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438803
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915333,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915333
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.024838811988033165,
"acc_norm": 0.6,
"acc_norm_stderr": 0.024838811988033165
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7779816513761468,
"acc_stderr": 0.01781884956479664,
"acc_norm": 0.7779816513761468,
"acc_norm_stderr": 0.01781884956479664
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.04260735157644559,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.04260735157644559
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7735042735042735,
"acc_stderr": 0.0274210072953929,
"acc_norm": 0.7735042735042735,
"acc_norm_stderr": 0.0274210072953929
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7420178799489144,
"acc_stderr": 0.01564583018834895,
"acc_norm": 0.7420178799489144,
"acc_norm_stderr": 0.01564583018834895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48379888268156424,
"acc_stderr": 0.01671372072950102,
"acc_norm": 0.48379888268156424,
"acc_norm_stderr": 0.01671372072950102
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283693,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971646,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971646
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.02712511551316685,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.02712511551316685
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.02971928127223684,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.02971928127223684
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567648,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567648
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.030161911930767105,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.030161911930767105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5604575163398693,
"acc_stderr": 0.020079420408087918,
"acc_norm": 0.5604575163398693,
"acc_norm_stderr": 0.020079420408087918
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547728,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547728
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573037,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573037
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2962056303549572,
"mc1_stderr": 0.015983595101811392,
"mc2": 0.4272385640595212,
"mc2_stderr": 0.015005651426028057
}
}
```
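To work with these numbers programmatically, one option is to pull the raw results file named above and rank the per-task accuracies. A minimal sketch — whether the JSON nests the dict shown above under a "results" key is an assumption, hence the fallback:
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file referenced above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_anhnv125__llama-op-v4",
    filename="results_2023-10-01T16-49-40.614622.json",
    repo_type="dataset",
)
with open(path) as f:
    payload = json.load(f)

# Assumption: the file may nest the metrics under a "results" key;
# fall back to the top level otherwise.
scores = payload.get("results", payload)

# Keep per-task accuracies ("all" is the aggregate; truthfulqa has no "acc").
per_task = {
    name: m["acc"] for name, m in scores.items() if name != "all" and "acc" in m
}
for name, acc in sorted(per_task.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{acc:.3f}  {name}")
```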
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
PanoEvJ/GPT3.5_summarization_preference_RLAIF | 2023-10-01T16:59:26.000Z | [
"region:us"
] | PanoEvJ | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 162321
num_examples: 100
download_size: 105617
dataset_size: 162321
---
# Dataset Card for "GPT3.5_summarization_preference_RLAIF"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rutson/Audio | 2023-10-01T17:05:09.000Z | [
"region:us"
] | Rutson | null | null | null | 0 | 0 | Entry not found |