id stringlengths 2 115 | lastModified stringlengths 24 24 | tags list | author stringlengths 2 42 ⌀ | description stringlengths 0 68.7k ⌀ | citation stringlengths 0 10.7k ⌀ | cardData null | likes int64 0 3.55k | downloads int64 0 10.1M | card stringlengths 0 1.01M |
|---|---|---|---|---|---|---|---|---|---|
WyldKnyght/pokemon_tcg_dataset | 2023-09-21T16:48:31.000Z | [
"license:mit",
"region:us"
] | WyldKnyght | null | null | null | 0 | 0 | ---
license: mit
---
|
dim/kinopoisk_raw | 2023-09-21T16:54:20.000Z | [
"region:us"
] | dim | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: content
dtype: string
- name: title
dtype: string
- name: grade3
dtype: string
- name: movie_name
dtype: string
- name: part
dtype: string
- name: review_id
dtype: string
- name: author
dtype: string
- name: date
dtype: string
- name: grade10
dtype: string
- name: Idx
dtype: int32
splits:
- name: train
num_bytes: 138684842
num_examples: 36591
download_size: 70387577
dataset_size: 138684842
---
# Dataset Card for "kinopoisk_raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HenriCastro/TN_ds_001 | 2023-09-21T17:00:30.000Z | [
"region:us"
] | HenriCastro | null | null | null | 0 | 0 | Entry not found |
AlekseyKorshuk/PIPPA-lmgym-old | 2023-09-21T18:49:50.000Z | [
"region:us"
] | AlekseyKorshuk | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
splits:
- name: train
num_bytes: 33744003688
num_examples: 415409
download_size: 0
dataset_size: 33744003688
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "PIPPA-lmgym"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
osore/regimeneutro | 2023-09-21T17:09:41.000Z | [
"region:us"
] | osore | null | null | null | 0 | 0 | Entry not found |
GabrielTOP/DanielDiaz | 2023-09-21T17:46:00.000Z | [
"region:us"
] | GabrielTOP | null | null | null | 0 | 0 | Entry not found |
dim/medical_qa_ru_data | 2023-09-21T17:44:32.000Z | [
"region:us"
] | dim | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: date
dtype: string
- name: categ
dtype: string
- name: theme
dtype: string
- name: desc
dtype: string
- name: ans
dtype: string
- name: spec10
dtype: string
splits:
- name: train
num_bytes: 268150120
num_examples: 190335
download_size: 132020030
dataset_size: 268150120
---
# Dataset Card for "medical_qa_ru_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
oozora/nisargadatta_maharaj | 2023-09-21T17:46:57.000Z | [
"license:mit",
"region:us"
] | oozora | null | null | null | 0 | 0 | ---
license: mit
---
|
shubhamagarwal92/rw_2308 | 2023-09-21T20:48:09.000Z | [
"region:us"
] | shubhamagarwal92 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: aid
dtype: string
- name: mid
dtype: string
- name: abstract
dtype: string
- name: corpusid
dtype: int64
- name: text_except_rw
dtype: string
- name: title
dtype: string
- name: related_work
dtype: string
- name: original_related_work
dtype: string
- name: ref_abstract
struct:
- name: abstract
sequence: string
- name: cite_N
sequence: string
- name: corpursid
sequence: string
- name: ref_abstract_original
struct:
- name: abstract
sequence: string
- name: cite_N
sequence: string
- name: corpursid
sequence: string
- name: ref_abstract_full_text
struct:
- name: abstract
sequence: string
- name: all_para_text
sequence: string
- name: cite_N
sequence: string
- name: corpursid
sequence: string
- name: ref_abstract_full_text_original
struct:
- name: abstract
sequence: string
- name: all_para_text
sequence: string
- name: cite_N
sequence: string
- name: corpursid
sequence: string
- name: total_cites
dtype: int64
splits:
- name: test
num_bytes: 2701053361
num_examples: 6276
download_size: 1209591039
dataset_size: 2701053361
---
# Dataset Card for "rw_2308"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/tohsaka_rin_fatestaynightufotable | 2023-09-21T18:23:46.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Tohsaka Rin
This is the dataset of Tohsaka Rin, containing 296 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 296 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 643 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 296 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 296 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 296 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 296 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 296 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 643 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 643 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 643 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
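As a minimal sketch (assuming the standard `huggingface_hub` client; the output directory name is arbitrary), any of the archives above can be fetched directly from the dataset repo:
```python
import zipfile
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="CyberHarem/tohsaka_rin_fatestaynightufotable",
    filename="dataset-512x512.zip",  # any archive name from the table above
    repo_type="dataset",
)
with zipfile.ZipFile(path) as zf:
    zf.extractall("tohsaka_rin_512x512")  # arbitrary target directory
```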
|
euclaise/goodreads_100k | 2023-09-21T18:25:56.000Z | [
"size_categories:10K<n<100K",
"license:cc0-1.0",
"region:us"
] | euclaise | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: author
dtype: string
- name: desc
dtype: string
- name: genre
dtype: string
- name: isbn
dtype: string
- name: link
dtype: string
- name: pages
dtype: int64
- name: rating
dtype: float64
- name: reviews
dtype: int64
- name: title
dtype: string
- name: totalratings
dtype: int64
splits:
- name: train
num_bytes: 111985794
num_examples: 100000
download_size: 69614148
dataset_size: 111985794
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc0-1.0
size_categories:
- 10K<n<100K
---
# Goodreads 100k
Clone of Manav Dhamani's [goodreads-books-100k](https://www.kaggle.com/datasets/mdhamani/goodreads-books-100k) dataset from Kaggle; a small filtering sketch is shown below.
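A minimal sketch, assuming the standard `datasets` library (the thresholds are arbitrary):
```python
from datasets import load_dataset

ds = load_dataset("euclaise/goodreads_100k", split="train")

# Keep well-rated, widely reviewed books (thresholds are arbitrary).
popular = ds.filter(lambda x: x["rating"] >= 4.0 and x["totalratings"] >= 1000)
print(len(popular), popular[0]["title"])
```
|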
Arkelaw/danielbroder | 2023-09-21T18:46:42.000Z | [
"region:us"
] | Arkelaw | null | null | null | 0 | 0 | Entry not found |
CyberHarem/saber_fatestaynightufotable | 2023-09-21T18:54:57.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Saber
This is the dataset of Saber, containing 294 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 294 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 675 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 294 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 294 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 294 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 294 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 294 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 675 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 675 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 675 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
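To enumerate the archives programmatically before picking one, a minimal sketch assuming the standard `huggingface_hub` client:
```python
from huggingface_hub import list_repo_files

# List every file in the dataset repo, then keep the zip archives.
files = list_repo_files("CyberHarem/saber_fatestaynightufotable", repo_type="dataset")
print([f for f in files if f.endswith(".zip")])
```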
|
sashrafi/test-repo | 2023-09-21T18:53:16.000Z | [
"region:us"
] | sashrafi | null | null | null | 0 | 0 | Entry not found |
justin933/test | 2023-09-21T18:58:58.000Z | [
"license:unknown",
"region:us"
] | justin933 | null | null | null | 0 | 0 | ---
license: unknown
---
|
cawoylel/OpenFulaSpeechCorpora | 2023-09-21T19:05:26.000Z | [
"region:us"
] | cawoylel | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: pulaar
path: data/pulaar-*
- split: liptako
path: data/liptako-*
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
- name: dialect
dtype: string
splits:
- name: pulaar
num_bytes: 3398551955.96
num_examples: 12880
- name: liptako
num_bytes: 490660761.51
num_examples: 10397
download_size: 3084439394
dataset_size: 3889212717.4700003
---
# Dataset Card for "OpenFulaSpeechCorpora"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/matou_sakura_fatestaynightufotable | 2023-09-21T19:15:07.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Matou Sakura
This is the dataset of Matou Sakura, containing 163 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 163 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 369 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 163 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 163 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 163 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 163 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 163 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 369 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 369 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 369 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
lilyhg/test | 2023-09-21T19:11:34.000Z | [
"license:apache-2.0",
"region:us"
] | lilyhg | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
CyberHarem/illyasviel_von_einzbern_fatestaynightufotable | 2023-09-21T19:26:17.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Illyasviel von Einzbern
This is the dataset of Illyasviel von Einzbern, containing 95 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 95 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 237 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 95 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 95 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 95 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 95 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 95 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 237 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 237 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 237 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
BangumiBase/akibameidosensou | 2023-09-29T10:03:29.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Akiba Meido Sensou
This is the image base of the bangumi Akiba Meido Sensou; we detected 48 characters and 2,198 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may actually contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is a preview of the characters:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 87 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 185 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 39 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 70 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 169 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 314 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 29 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 16 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 28 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 24 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 37 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 21 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 60 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 31 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 35 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 158 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 16 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 13 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 9 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 12 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 33 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 85 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 34 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 9 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 10 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 8 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 10 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 23 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 21 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 38 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 6 | [Download](30/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 31 | 7 | [Download](31/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 32 | 9 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 16 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 11 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 15 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 7 | [Download](36/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 37 | 28 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 136 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 14 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 9 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 9 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 22 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 10 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 11 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 5 | [Download](45/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 46 | 7 | [Download](46/dataset.zip) |  |  |  |  |  |  |  | N/A |
| noise | 252 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
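A minimal sketch for pulling a single character folder instead of the full `all.zip`, assuming the standard `huggingface_hub` client:
```python
from huggingface_hub import snapshot_download

# Download only character 5's folder (314 images per the table above).
local_dir = snapshot_download(
    repo_id="BangumiBase/akibameidosensou",
    repo_type="dataset",
    allow_patterns=["5/*"],
)
print(local_dir)
```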
|
dim/joke_explaination | 2023-09-21T19:41:19.000Z | [
"region:us"
] | dim | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: url
dtype: string
- name: joke
dtype: string
- name: explaination
dtype: string
splits:
- name: train
num_bytes: 262894
num_examples: 377
download_size: 143161
dataset_size: 262894
---
# Dataset Card for "joke_explaination"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_lgaalves__gpt-2-xl_camel-ai-physics | 2023-09-21T19:47:23.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lgaalves/gpt-2-xl_camel-ai-physics
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/gpt-2-xl_camel-ai-physics](https://huggingface.co/lgaalves/gpt-2-xl_camel-ai-physics)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__gpt-2-xl_camel-ai-physics\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-21T19:46:11.375703](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt-2-xl_camel-ai-physics/blob/main/results_2023-09-21T19-46-11.375703.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2702104699407836,\n\
\ \"acc_stderr\": 0.03206267177660607,\n \"acc_norm\": 0.2723837797096797,\n\
\ \"acc_norm_stderr\": 0.03206926157494483,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.01484306150773162,\n \"mc2\": 0.3911759819393894,\n\
\ \"mc2_stderr\": 0.014209348354116707\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.27474402730375425,\n \"acc_stderr\": 0.013044617212771227,\n\
\ \"acc_norm\": 0.295221843003413,\n \"acc_norm_stderr\": 0.013329750293382316\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.39842660824536946,\n\
\ \"acc_stderr\": 0.004885735963346902,\n \"acc_norm\": 0.5061740689105756,\n\
\ \"acc_norm_stderr\": 0.004989400984722232\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.19,\n\
\ \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.19,\n \
\ \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.32075471698113206,\n \"acc_stderr\": 0.028727502957880263,\n\
\ \"acc_norm\": 0.32075471698113206,\n \"acc_norm_stderr\": 0.028727502957880263\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.3352601156069364,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.19574468085106383,\n \"acc_stderr\": 0.025937853139977148,\n\
\ \"acc_norm\": 0.19574468085106383,\n \"acc_norm_stderr\": 0.025937853139977148\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378947,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378947\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.02458002892148101,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.02458002892148101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3151515151515151,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.3151515151515151,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.03358618145732523,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03358618145732523\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.29015544041450775,\n \"acc_stderr\": 0.03275264467791516,\n\
\ \"acc_norm\": 0.29015544041450775,\n \"acc_norm_stderr\": 0.03275264467791516\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3487179487179487,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.3487179487179487,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958948,\n\
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958948\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.30825688073394497,\n \"acc_stderr\": 0.019798366698367268,\n \"\
acc_norm\": 0.30825688073394497,\n \"acc_norm_stderr\": 0.019798366698367268\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.21568627450980393,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.189873417721519,\n \"acc_stderr\": 0.025530100460233504,\n \
\ \"acc_norm\": 0.189873417721519,\n \"acc_norm_stderr\": 0.025530100460233504\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13004484304932734,\n\
\ \"acc_stderr\": 0.02257451942417487,\n \"acc_norm\": 0.13004484304932734,\n\
\ \"acc_norm_stderr\": 0.02257451942417487\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3592233009708738,\n \"acc_stderr\": 0.04750458399041693,\n\
\ \"acc_norm\": 0.3592233009708738,\n \"acc_norm_stderr\": 0.04750458399041693\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.02934311479809445,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.02934311479809445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n\
\ \"acc_stderr\": 0.01588988836256049,\n \"acc_norm\": 0.2707535121328225,\n\
\ \"acc_norm_stderr\": 0.01588988836256049\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044276,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044276\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261469,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261469\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18006430868167203,\n\
\ \"acc_stderr\": 0.021823422857744953,\n \"acc_norm\": 0.18006430868167203,\n\
\ \"acc_norm_stderr\": 0.021823422857744953\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432407,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2503259452411995,\n\
\ \"acc_stderr\": 0.011064151027165434,\n \"acc_norm\": 0.2503259452411995,\n\
\ \"acc_norm_stderr\": 0.011064151027165434\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.34191176470588236,\n \"acc_stderr\": 0.02881472242225417,\n\
\ \"acc_norm\": 0.34191176470588236,\n \"acc_norm_stderr\": 0.02881472242225417\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24509803921568626,\n \"acc_stderr\": 0.017401816711427657,\n \
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.017401816711427657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.33877551020408164,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.33877551020408164,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.01484306150773162,\n \"mc2\": 0.3911759819393894,\n\
\ \"mc2_stderr\": 0.014209348354116707\n }\n}\n```"
repo_url: https://huggingface.co/lgaalves/gpt-2-xl_camel-ai-physics
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|arc:challenge|25_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hellaswag|10_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T19-46-11.375703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T19-46-11.375703.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T19-46-11.375703.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T19-46-11.375703.parquet'
- config_name: results
data_files:
- split: 2023_09_21T19_46_11.375703
path:
- results_2023-09-21T19-46-11.375703.parquet
- split: latest
path:
- results_2023-09-21T19-46-11.375703.parquet
---
# Dataset Card for Evaluation run of lgaalves/gpt-2-xl_camel-ai-physics
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/gpt-2-xl_camel-ai-physics
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/gpt-2-xl_camel-ai-physics](https://huggingface.co/lgaalves/gpt-2-xl_camel-ai-physics) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__gpt-2-xl_camel-ai-physics",
"harness_truthfulqa_mc_0",
	split="latest")
```
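The aggregated scores can be loaded the same way through the "results" configuration. A minimal sketch, assuming only the `datasets` library and the split names listed in the configs above:
```python
from datasets import load_dataset

# Load the aggregated metrics for this model's most recent evaluation.
# "results" is the extra configuration described above, and the "latest"
# split always points at the newest run timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_lgaalves__gpt-2-xl_camel-ai-physics",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the run's aggregated scores
```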
## Latest results
These are the [latest results from run 2023-09-21T19:46:11.375703](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt-2-xl_camel-ai-physics/blob/main/results_2023-09-21T19-46-11.375703.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2702104699407836,
"acc_stderr": 0.03206267177660607,
"acc_norm": 0.2723837797096797,
"acc_norm_stderr": 0.03206926157494483,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.01484306150773162,
"mc2": 0.3911759819393894,
"mc2_stderr": 0.014209348354116707
},
"harness|arc:challenge|25": {
"acc": 0.27474402730375425,
"acc_stderr": 0.013044617212771227,
"acc_norm": 0.295221843003413,
"acc_norm_stderr": 0.013329750293382316
},
"harness|hellaswag|10": {
"acc": 0.39842660824536946,
"acc_stderr": 0.004885735963346902,
"acc_norm": 0.5061740689105756,
"acc_norm_stderr": 0.004989400984722232
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.32075471698113206,
"acc_stderr": 0.028727502957880263,
"acc_norm": 0.32075471698113206,
"acc_norm_stderr": 0.028727502957880263
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.19574468085106383,
"acc_stderr": 0.025937853139977148,
"acc_norm": 0.19574468085106383,
"acc_norm_stderr": 0.025937853139977148
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03855289616378947,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03855289616378947
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.02458002892148101,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.02458002892148101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3151515151515151,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.3151515151515151,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03358618145732523,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03358618145732523
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29015544041450775,
"acc_stderr": 0.03275264467791516,
"acc_norm": 0.29015544041450775,
"acc_norm_stderr": 0.03275264467791516
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3487179487179487,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.3487179487179487,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958948,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958948
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.30825688073394497,
"acc_stderr": 0.019798366698367268,
"acc_norm": 0.30825688073394497,
"acc_norm_stderr": 0.019798366698367268
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.189873417721519,
"acc_stderr": 0.025530100460233504,
"acc_norm": 0.189873417721519,
"acc_norm_stderr": 0.025530100460233504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.13004484304932734,
"acc_stderr": 0.02257451942417487,
"acc_norm": 0.13004484304932734,
"acc_norm_stderr": 0.02257451942417487
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.3592233009708738,
"acc_stderr": 0.04750458399041693,
"acc_norm": 0.3592233009708738,
"acc_norm_stderr": 0.04750458399041693
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02934311479809445,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02934311479809445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.01588988836256049,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.01588988836256049
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044276,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044276
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261469,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261469
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.18006430868167203,
"acc_stderr": 0.021823422857744953,
"acc_norm": 0.18006430868167203,
"acc_norm_stderr": 0.021823422857744953
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432407,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2503259452411995,
"acc_stderr": 0.011064151027165434,
"acc_norm": 0.2503259452411995,
"acc_norm_stderr": 0.011064151027165434
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34191176470588236,
"acc_stderr": 0.02881472242225417,
"acc_norm": 0.34191176470588236,
"acc_norm_stderr": 0.02881472242225417
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.017401816711427657,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.017401816711427657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.33877551020408164,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.33877551020408164,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.01484306150773162,
"mc2": 0.3911759819393894,
"mc2_stderr": 0.014209348354116707
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
SomeBaryy/Ubuntu23.04 | 2023-09-21T19:46:35.000Z | [
"region:us"
] | SomeBaryy | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_speechlessai__speechless-llama2-dolphin-orca-platypus-13b | 2023-09-21T19:49:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of speechlessai/speechless-llama2-dolphin-orca-platypus-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [speechlessai/speechless-llama2-dolphin-orca-platypus-13b](https://huggingface.co/speechlessai/speechless-llama2-dolphin-orca-platypus-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_speechlessai__speechless-llama2-dolphin-orca-platypus-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-21T19:47:48.023587](https://huggingface.co/datasets/open-llm-leaderboard/details_speechlessai__speechless-llama2-dolphin-orca-platypus-13b/blob/main/results_2023-09-21T19-47-48.023587.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5793187175453398,\n\
\ \"acc_stderr\": 0.03415431351607796,\n \"acc_norm\": 0.5834760701082039,\n\
\ \"acc_norm_stderr\": 0.03413325701237462,\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.01615020132132302,\n \"mc2\": 0.43438938554070605,\n\
\ \"mc2_stderr\": 0.014699327672556812\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5571672354948806,\n \"acc_stderr\": 0.014515573873348902,\n\
\ \"acc_norm\": 0.5964163822525598,\n \"acc_norm_stderr\": 0.014337158914268447\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6204939255128461,\n\
\ \"acc_stderr\": 0.004842723234022031,\n \"acc_norm\": 0.8265285799641505,\n\
\ \"acc_norm_stderr\": 0.003778804474605914\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35978835978835977,\n \"acc_stderr\": 0.024718075944129277,\n \"\
acc_norm\": 0.35978835978835977,\n \"acc_norm_stderr\": 0.024718075944129277\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727062,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727062\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6838709677419355,\n \"acc_stderr\": 0.02645087448904277,\n \"\
acc_norm\": 0.6838709677419355,\n \"acc_norm_stderr\": 0.02645087448904277\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316453,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316453\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.02502861027671086,\n \
\ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.02502861027671086\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.781651376146789,\n \"acc_stderr\": 0.017712600528722727,\n \"\
acc_norm\": 0.781651376146789,\n \"acc_norm_stderr\": 0.017712600528722727\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069425,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069425\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260594,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260594\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.02466249684520982,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.02466249684520982\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7918263090676884,\n\
\ \"acc_stderr\": 0.014518592248904033,\n \"acc_norm\": 0.7918263090676884,\n\
\ \"acc_norm_stderr\": 0.014518592248904033\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879716,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4569832402234637,\n\
\ \"acc_stderr\": 0.01666049858050917,\n \"acc_norm\": 0.4569832402234637,\n\
\ \"acc_norm_stderr\": 0.01666049858050917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.027420477662629245,\n\
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.027420477662629245\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.02685882587948854,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.02685882587948854\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719964,\n\
\ \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719964\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255855,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255855\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n\
\ \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n\
\ \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03004261583271486,\n\
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03004261583271486\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5866013071895425,\n \"acc_stderr\": 0.019922115682786682,\n \
\ \"acc_norm\": 0.5866013071895425,\n \"acc_norm_stderr\": 0.019922115682786682\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919798,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.01615020132132302,\n \"mc2\": 0.43438938554070605,\n\
\ \"mc2_stderr\": 0.014699327672556812\n }\n}\n```"
repo_url: https://huggingface.co/speechlessai/speechless-llama2-dolphin-orca-platypus-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|arc:challenge|25_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hellaswag|10_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T19-47-48.023587.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T19-47-48.023587.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T19-47-48.023587.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T19-47-48.023587.parquet'
- config_name: results
data_files:
- split: 2023_09_21T19_47_48.023587
path:
- results_2023-09-21T19-47-48.023587.parquet
- split: latest
path:
- results_2023-09-21T19-47-48.023587.parquet
---
# Dataset Card for Evaluation run of speechlessai/speechless-llama2-dolphin-orca-platypus-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/speechlessai/speechless-llama2-dolphin-orca-platypus-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [speechlessai/speechless-llama2-dolphin-orca-platypus-13b](https://huggingface.co/speechlessai/speechless-llama2-dolphin-orca-platypus-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_speechlessai__speechless-llama2-dolphin-orca-platypus-13b",
"harness_truthfulqa_mc_0",
split="train")
```
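To work with the aggregated metrics instead of the per-task details, a minimal sketch (relying on the `results` configuration and `latest` split declared in the configs above) would be:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split points
# to the most recent evaluation run declared in the configs above.
results = load_dataset(
    "open-llm-leaderboard/details_speechlessai__speechless-llama2-dolphin-orca-platypus-13b",
    "results",
    split="latest",
)
print(results[0])
```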
## Latest results
These are the [latest results from run 2023-09-21T19:47:48.023587](https://huggingface.co/datasets/open-llm-leaderboard/details_speechlessai__speechless-llama2-dolphin-orca-platypus-13b/blob/main/results_2023-09-21T19-47-48.023587.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5793187175453398,
"acc_stderr": 0.03415431351607796,
"acc_norm": 0.5834760701082039,
"acc_norm_stderr": 0.03413325701237462,
"mc1": 0.3072215422276622,
"mc1_stderr": 0.01615020132132302,
"mc2": 0.43438938554070605,
"mc2_stderr": 0.014699327672556812
},
"harness|arc:challenge|25": {
"acc": 0.5571672354948806,
"acc_stderr": 0.014515573873348902,
"acc_norm": 0.5964163822525598,
"acc_norm_stderr": 0.014337158914268447
},
"harness|hellaswag|10": {
"acc": 0.6204939255128461,
"acc_stderr": 0.004842723234022031,
"acc_norm": 0.8265285799641505,
"acc_norm_stderr": 0.003778804474605914
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35978835978835977,
"acc_stderr": 0.024718075944129277,
"acc_norm": 0.35978835978835977,
"acc_norm_stderr": 0.024718075944129277
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727062,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727062
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6838709677419355,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.6838709677419355,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316453,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.02502861027671086,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.02502861027671086
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.781651376146789,
"acc_stderr": 0.017712600528722727,
"acc_norm": 0.781651376146789,
"acc_norm_stderr": 0.017712600528722727
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069425,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069425
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260594,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260594
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520982,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520982
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7918263090676884,
"acc_stderr": 0.014518592248904033,
"acc_norm": 0.7918263090676884,
"acc_norm_stderr": 0.014518592248904033
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879716,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4569832402234637,
"acc_stderr": 0.01666049858050917,
"acc_norm": 0.4569832402234637,
"acc_norm_stderr": 0.01666049858050917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.027420477662629245,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.027420477662629245
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.02685882587948854,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.02685882587948854
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719964,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719964
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255855,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03004261583271486,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03004261583271486
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5866013071895425,
"acc_stderr": 0.019922115682786682,
"acc_norm": 0.5866013071895425,
"acc_norm_stderr": 0.019922115682786682
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919798,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919798
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3072215422276622,
"mc1_stderr": 0.01615020132132302,
"mc2": 0.43438938554070605,
"mc2_stderr": 0.014699327672556812
}
}
```
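As a quick illustration of how these numbers are structured, here is a small sketch (assuming the JSON block above has been read into a string `raw`, a hypothetical variable) that ranks the per-task accuracies:
```python
import json

# `raw` is assumed to hold the JSON text shown above.
results = json.loads(raw)

# Rank per-task accuracies, skipping the "all" aggregate and the
# TruthfulQA entry (which reports mc1/mc2 instead of acc).
task_acc = {
    task: scores["acc"]
    for task, scores in results.items()
    if task != "all" and "acc" in scores
}
for task, acc in sorted(task_acc.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc:.3f}")
```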
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-80K | 2023-09-21T20:13:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of marcchew/Marcoroni-7B-LaMini-80K
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [marcchew/Marcoroni-7B-LaMini-80K](https://huggingface.co/marcchew/Marcoroni-7B-LaMini-80K)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-80K\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-21T20:12:12.451376](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-80K/blob/main/results_2023-09-21T20-12-12.451376.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24439777883935013,\n\
\ \"acc_stderr\": 0.031195300222398584,\n \"acc_norm\": 0.245630439346714,\n\
\ \"acc_norm_stderr\": 0.03121455158371846,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662574,\n \"mc2\": 0.497138264046056,\n\
\ \"mc2_stderr\": 0.01675032798005169\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22098976109215018,\n \"acc_stderr\": 0.012124929206818258,\n\
\ \"acc_norm\": 0.28754266211604096,\n \"acc_norm_stderr\": 0.013226719056266129\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2551284604660426,\n\
\ \"acc_stderr\": 0.004350424750646203,\n \"acc_norm\": 0.2613025293766182,\n\
\ \"acc_norm_stderr\": 0.004384465219070755\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.15,\n \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\"\
: 0.15,\n \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.0309528902177499,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.0309528902177499\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945637,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945637\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.02925282329180362,\n\
\ \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.02925282329180362\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.02102067268082791,\n \
\ \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.02102067268082791\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23669724770642203,\n \"acc_stderr\": 0.01822407811729908,\n \"\
acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.01822407811729908\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16203703703703703,\n \"acc_stderr\": 0.02513045365226846,\n \"\
acc_norm\": 0.16203703703703703,\n \"acc_norm_stderr\": 0.02513045365226846\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.37668161434977576,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28735632183908044,\n\
\ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.28735632183908044,\n\
\ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912258,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912258\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.01766784161237899,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.01766784161237899\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.02412746346265015,\n\
\ \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.02412746346265015\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662574,\n \"mc2\": 0.497138264046056,\n\
\ \"mc2_stderr\": 0.01675032798005169\n }\n}\n```"
repo_url: https://huggingface.co/marcchew/Marcoroni-7B-LaMini-80K
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|arc:challenge|25_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hellaswag|10_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T20-12-12.451376.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T20-12-12.451376.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T20-12-12.451376.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T20-12-12.451376.parquet'
- config_name: results
data_files:
- split: 2023_09_21T20_12_12.451376
path:
- results_2023-09-21T20-12-12.451376.parquet
- split: latest
path:
- results_2023-09-21T20-12-12.451376.parquet
---
# Dataset Card for Evaluation run of marcchew/Marcoroni-7B-LaMini-80K
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/marcchew/Marcoroni-7B-LaMini-80K
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [marcchew/Marcoroni-7B-LaMini-80K](https://huggingface.co/marcchew/Marcoroni-7B-LaMini-80K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-80K",
"harness_truthfulqa_mc_0",
split="train")
```
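The same pattern works for any configuration listed above. As a minimal sketch using only the configs and splits declared in this card, the aggregated scores can be loaded from the `results` configuration at its latest timestamp:
```python
from datasets import load_dataset

# "results" is the aggregated-scores configuration declared in this card;
# the "latest" split always points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-80K",
    "results",
    split="latest",
)
```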
## Latest results
These are the [latest results from run 2023-09-21T20:12:12.451376](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-80K/blob/main/results_2023-09-21T20-12-12.451376.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24439777883935013,
"acc_stderr": 0.031195300222398584,
"acc_norm": 0.245630439346714,
"acc_norm_stderr": 0.03121455158371846,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662574,
"mc2": 0.497138264046056,
"mc2_stderr": 0.01675032798005169
},
"harness|arc:challenge|25": {
"acc": 0.22098976109215018,
"acc_stderr": 0.012124929206818258,
"acc_norm": 0.28754266211604096,
"acc_norm_stderr": 0.013226719056266129
},
"harness|hellaswag|10": {
"acc": 0.2551284604660426,
"acc_stderr": 0.004350424750646203,
"acc_norm": 0.2613025293766182,
"acc_norm_stderr": 0.004384465219070755
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740206,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740206
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826372,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826372
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.0309528902177499,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.0309528902177499
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945637,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945637
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.02925282329180362,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.02925282329180362
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.02102067268082791,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.02102067268082791
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.01822407811729908,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.01822407811729908
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16203703703703703,
"acc_stderr": 0.02513045365226846,
"acc_norm": 0.16203703703703703,
"acc_norm_stderr": 0.02513045365226846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004253,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004253
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28735632183908044,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.28735632183908044,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090201,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090201
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.01766784161237899,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.01766784161237899
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.02412746346265015,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.02412746346265015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662574,
"mc2": 0.497138264046056,
"mc2_stderr": 0.01675032798005169
}
}
```
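The raw JSON linked above can also be fetched directly. Here is a minimal sketch using the `huggingface_hub` client; the filename is taken verbatim from the `results` configuration in this card:
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated-results file referenced in this card.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-80K",
    filename="results_2023-09-21T20-12-12.451376.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

print(results["all"])  # the aggregated metrics shown above
```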
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
pccl-org/formal-logic-simple-order-new-objects-bigger-500 | 2023-09-21T20:20:18.000Z | [
"region:us"
] | pccl-org | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: greater_than
dtype: string
- name: less_than
dtype: string
- name: correct_example
sequence: string
- name: incorrect_example
sequence: string
- name: distance
dtype: int64
- name: index
dtype: int64
splits:
- name: train
num_bytes: 17349731
num_examples: 124750
download_size: 3733412
dataset_size: 17349731
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "formal-logic-simple-order-new-objects-bigger-500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kweimann/poe-learning-layouts | 2023-09-23T12:10:21.000Z | [
"license:mit",
"region:us"
] | kweimann | null | null | null | 0 | 0 | ---
license: mit
---
# Learning layouts in Path of Exile with Vision Transformers: A proof of concept
<video controls autoplay src="https://cdn-uploads.huggingface.co/production/uploads/650c55bc9169ea73315b6c22/RJ-rTPWwOFUZlA3ydqhZ2.mp4"></video>
Where's the exit? This question often crosses the minds of newcomers and seasoned players alike. The key lies in understanding the game's layouts, especially during the campaign, when taking a wrong turn can significantly slow you down. Our project aims to solve this challenge through machine learning.
We've developed a proof-of-concept for learning layouts in Path of Exile using Vision Transformers. We trained a Vision Transformer to predict the direction of the exit in the A3 Marketplace, relying solely on a video of the minimap. You can see the model in action in the video above: the red arrow indicates the predicted exit direction, while the green arrow shows the actual direction.
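For illustration only, here is a minimal sketch of the core idea, not the project's actual code (see the repository below for that): the backbone choice, the 224×224 input resolution, and the (cos, sin) regression target are all assumptions made for the example.
```python
import torch
import timm

# Illustrative ViT backbone with a 2-unit head predicting the (cos, sin)
# of the exit direction; set pretrained=True to start from ImageNet weights.
model = timm.create_model("vit_base_patch16_224", pretrained=False, num_classes=2)

def direction_loss(pred: torch.Tensor, angle: torch.Tensor) -> torch.Tensor:
    """MSE between the normalized predicted heading and the true exit angle."""
    target = torch.stack([torch.cos(angle), torch.sin(angle)], dim=-1)
    return torch.nn.functional.mse_loss(
        torch.nn.functional.normalize(pred, dim=-1), target
    )

frames = torch.randn(8, 3, 224, 224)   # a batch of minimap frames
angles = torch.rand(8) * 2 * torch.pi  # ground-truth exit directions (radians)
loss = direction_loss(model(frames), angles)
loss.backward()
```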
Project page: https://github.com/kweimann/poe-learning-layouts |
dim/law_stackexchange | 2023-09-21T20:56:41.000Z | [
"region:us"
] | dim | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: question_id
dtype: int64
- name: tags
sequence: string
- name: score
dtype: int64
- name: license
dtype: string
- name: link
dtype: string
- name: question_title
dtype: string
- name: question_body
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: body
dtype: string
- name: score
dtype: int64
splits:
- name: train
num_bytes: 95966652
num_examples: 24370
download_size: 53517367
dataset_size: 95966652
---
# Dataset Card for "law_stackexchange"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Lazycuber__L2-7b-Base-Guanaco-Uncensored | 2023-09-21T20:59:59.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Lazycuber/L2-7b-Base-Guanaco-Uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Lazycuber/L2-7b-Base-Guanaco-Uncensored](https://huggingface.co/Lazycuber/L2-7b-Base-Guanaco-Uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lazycuber__L2-7b-Base-Guanaco-Uncensored\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-21T20:58:37.445412](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__L2-7b-Base-Guanaco-Uncensored/blob/main/results_2023-09-21T20-58-37.445412.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46891978366173287,\n\
\ \"acc_stderr\": 0.03526325786670738,\n \"acc_norm\": 0.47277267365887526,\n\
\ \"acc_norm_stderr\": 0.03524869113841017,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.42967422660946836,\n\
\ \"mc2_stderr\": 0.014075137629554195\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49402730375426623,\n \"acc_stderr\": 0.014610348300255795,\n\
\ \"acc_norm\": 0.5221843003412969,\n \"acc_norm_stderr\": 0.014597001927076136\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5916152160924119,\n\
\ \"acc_stderr\": 0.00490530437109087,\n \"acc_norm\": 0.7907787293367855,\n\
\ \"acc_norm_stderr\": 0.004059213774735547\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44528301886792454,\n \"acc_stderr\": 0.030588052974270658,\n\
\ \"acc_norm\": 0.44528301886792454,\n \"acc_norm_stderr\": 0.030588052974270658\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.040925639582376536,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.040925639582376536\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715563,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370331,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370331\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.023393826500484865,\n \"\
acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484865\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.47419354838709676,\n \"acc_stderr\": 0.02840609505765332,\n \"\
acc_norm\": 0.47419354838709676,\n \"acc_norm_stderr\": 0.02840609505765332\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35960591133004927,\n \"acc_stderr\": 0.033764582465095665,\n \"\
acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.033764582465095665\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.48484848484848486,\n \"acc_stderr\": 0.0356071651653106,\n \"\
acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.0356071651653106\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.689119170984456,\n \"acc_stderr\": 0.03340361906276586,\n\
\ \"acc_norm\": 0.689119170984456,\n \"acc_norm_stderr\": 0.03340361906276586\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.025174048384000756,\n \
\ \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.025174048384000756\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478465,\n\
\ \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478465\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6256880733944954,\n \"acc_stderr\": 0.020748959408988306,\n \"\
acc_norm\": 0.6256880733944954,\n \"acc_norm_stderr\": 0.020748959408988306\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.029531221160930918,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.029531221160930918\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.553921568627451,\n \"acc_stderr\": 0.03488845451304974,\n\
\ \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.03488845451304974\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.620253164556962,\n \"acc_stderr\": 0.031591887529658504,\n \
\ \"acc_norm\": 0.620253164556962,\n \"acc_norm_stderr\": 0.031591887529658504\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.04356447202665069,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.04356447202665069\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190192,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.049111471073657764,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.049111471073657764\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7136752136752137,\n\
\ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.7136752136752137,\n\
\ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6385696040868455,\n\
\ \"acc_stderr\": 0.017179601328900732,\n \"acc_norm\": 0.6385696040868455,\n\
\ \"acc_norm_stderr\": 0.017179601328900732\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382875,\n\
\ \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.48366013071895425,\n \"acc_stderr\": 0.028614624752805413,\n\
\ \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.028614624752805413\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.028602085862759426,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.028602085862759426\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3559322033898305,\n\
\ \"acc_stderr\": 0.012228645537277568,\n \"acc_norm\": 0.3559322033898305,\n\
\ \"acc_norm_stderr\": 0.012228645537277568\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.03027332507734576,\n\
\ \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.03027332507734576\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4542483660130719,\n \"acc_stderr\": 0.020142974553795195,\n \
\ \"acc_norm\": 0.4542483660130719,\n \"acc_norm_stderr\": 0.020142974553795195\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.047381987035454834,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.047381987035454834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.42448979591836733,\n \"acc_stderr\": 0.031642094879429414,\n\
\ \"acc_norm\": 0.42448979591836733,\n \"acc_norm_stderr\": 0.031642094879429414\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824564,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824564\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.42967422660946836,\n\
\ \"mc2_stderr\": 0.014075137629554195\n }\n}\n```"
repo_url: https://huggingface.co/Lazycuber/L2-7b-Base-Guanaco-Uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|arc:challenge|25_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hellaswag|10_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T20-58-37.445412.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T20-58-37.445412.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T20-58-37.445412.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T20-58-37.445412.parquet'
- config_name: results
data_files:
- split: 2023_09_21T20_58_37.445412
path:
- results_2023-09-21T20-58-37.445412.parquet
- split: latest
path:
- results_2023-09-21T20-58-37.445412.parquet
---
# Dataset Card for Evaluation run of Lazycuber/L2-7b-Base-Guanaco-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Lazycuber/L2-7b-Base-Guanaco-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Lazycuber/L2-7b-Base-Guanaco-Uncensored](https://huggingface.co/Lazycuber/L2-7b-Base-Guanaco-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lazycuber__L2-7b-Base-Guanaco-Uncensored",
"harness_truthfulqa_mc_0",
split="train")
```
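The header above also declares a "results" configuration whose "latest" split holds the aggregated scores; a minimal sketch for loading it (assuming nothing beyond what those configs list):
```python
from datasets import load_dataset

# Load the aggregated metrics of the run; the "results" configuration and its
# "latest" split are declared in the YAML header of this card.
results = load_dataset(
    "open-llm-leaderboard/details_Lazycuber__L2-7b-Base-Guanaco-Uncensored",
    "results",
    split="latest",
)
print(results[0])  # first row of the aggregated results table
```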
## Latest results
These are the [latest results from run 2023-09-21T20:58:37.445412](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__L2-7b-Base-Guanaco-Uncensored/blob/main/results_2023-09-21T20-58-37.445412.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.46891978366173287,
"acc_stderr": 0.03526325786670738,
"acc_norm": 0.47277267365887526,
"acc_norm_stderr": 0.03524869113841017,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.42967422660946836,
"mc2_stderr": 0.014075137629554195
},
"harness|arc:challenge|25": {
"acc": 0.49402730375426623,
"acc_stderr": 0.014610348300255795,
"acc_norm": 0.5221843003412969,
"acc_norm_stderr": 0.014597001927076136
},
"harness|hellaswag|10": {
"acc": 0.5916152160924119,
"acc_stderr": 0.00490530437109087,
"acc_norm": 0.7907787293367855,
"acc_norm_stderr": 0.004059213774735547
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777471,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44528301886792454,
"acc_stderr": 0.030588052974270658,
"acc_norm": 0.44528301886792454,
"acc_norm_stderr": 0.030588052974270658
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.040925639582376536,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.040925639582376536
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715563,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370331,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370331
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.023393826500484865,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.023393826500484865
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.47419354838709676,
"acc_stderr": 0.02840609505765332,
"acc_norm": 0.47419354838709676,
"acc_norm_stderr": 0.02840609505765332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.033764582465095665,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.033764582465095665
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.0356071651653106,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.0356071651653106
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.689119170984456,
"acc_stderr": 0.03340361906276586,
"acc_norm": 0.689119170984456,
"acc_norm_stderr": 0.03340361906276586
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.025174048384000756,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.025174048384000756
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478465,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478465
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6256880733944954,
"acc_stderr": 0.020748959408988306,
"acc_norm": 0.6256880733944954,
"acc_norm_stderr": 0.020748959408988306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25,
"acc_stderr": 0.029531221160930918,
"acc_norm": 0.25,
"acc_norm_stderr": 0.029531221160930918
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.03488845451304974,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.03488845451304974
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.620253164556962,
"acc_stderr": 0.031591887529658504,
"acc_norm": 0.620253164556962,
"acc_norm_stderr": 0.031591887529658504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.04356447202665069,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.04356447202665069
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190192,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.049111471073657764,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.049111471073657764
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7136752136752137,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.7136752136752137,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6385696040868455,
"acc_stderr": 0.017179601328900732,
"acc_norm": 0.6385696040868455,
"acc_norm_stderr": 0.017179601328900732
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.026897049996382875,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.026897049996382875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.028614624752805413,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.028614624752805413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.02795048149440127,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.02795048149440127
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.028602085862759426,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.028602085862759426
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3559322033898305,
"acc_stderr": 0.012228645537277568,
"acc_norm": 0.3559322033898305,
"acc_norm_stderr": 0.012228645537277568
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.03027332507734576,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.03027332507734576
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4542483660130719,
"acc_stderr": 0.020142974553795195,
"acc_norm": 0.4542483660130719,
"acc_norm_stderr": 0.020142974553795195
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.047381987035454834,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.047381987035454834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.42448979591836733,
"acc_stderr": 0.031642094879429414,
"acc_norm": 0.42448979591836733,
"acc_norm_stderr": 0.031642094879429414
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824564,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.42967422660946836,
"mc2_stderr": 0.014075137629554195
}
}
```
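The block above is plain JSON, so the per-task scores can be ranked directly; a minimal sketch, assuming the block has been saved locally as `results.json` (a hypothetical path):
```python
import json

# Assumes the results block above was saved locally as "results.json"
# (hypothetical path; adjust to wherever you stored it).
with open("results.json") as f:
    results = json.load(f)

# Keep only per-task entries that report a normalized accuracy; "all" is the
# aggregate and the TruthfulQA entry only reports mc1/mc2.
per_task = {
    name: metrics["acc_norm"]
    for name, metrics in results.items()
    if name != "all" and "acc_norm" in metrics
}
for name, score in sorted(per_task.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{name}: {score:.3f}")
```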
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
orderofmagnitude/coT | 2023-09-21T21:30:43.000Z | [
"region:us"
] | orderofmagnitude | null | null | null | 0 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Spuddle/Spuddy | 2023-09-21T21:14:31.000Z | [
"license:openrail",
"region:us"
] | Spuddle | null | null | null | 0 | 0 | ---
license: openrail
---
|
Escobar37/Frizer | 2023-09-21T21:11:01.000Z | [
"license:openrail",
"region:us"
] | Escobar37 | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic | 2023-09-21T21:28:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Voicelab/trurl-2-13b-academic
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Voicelab/trurl-2-13b-academic](https://huggingface.co/Voicelab/trurl-2-13b-academic)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-21T21:26:52.608718](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic/blob/main/results_2023-09-21T21-26-52.608718.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each task in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.552524555901937,\n\
\ \"acc_stderr\": 0.03461609018525162,\n \"acc_norm\": 0.556588349310513,\n\
\ \"acc_norm_stderr\": 0.03459893731628851,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839673,\n \"mc2\": 0.4346341998016549,\n\
\ \"mc2_stderr\": 0.014484208469957361\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5392491467576792,\n \"acc_stderr\": 0.014566303676636586,\n\
\ \"acc_norm\": 0.5793515358361775,\n \"acc_norm_stderr\": 0.014426211252508406\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5957976498705437,\n\
\ \"acc_stderr\": 0.0048973407933143795,\n \"acc_norm\": 0.7954590718980283,\n\
\ \"acc_norm_stderr\": 0.0040254139486194\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.569811320754717,\n \"acc_stderr\": 0.030471445867183238,\n\
\ \"acc_norm\": 0.569811320754717,\n \"acc_norm_stderr\": 0.030471445867183238\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3544973544973545,\n \"acc_stderr\": 0.024636830602842,\n \"acc_norm\"\
: 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602842\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\
\ \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.6419354838709678,\n\
\ \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.03646204963253812,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.03646204963253812\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n\
\ \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7412844036697248,\n \"acc_stderr\": 0.01877605231961963,\n \"\
acc_norm\": 0.7412844036697248,\n \"acc_norm_stderr\": 0.01877605231961963\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475544,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475544\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613538,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613538\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264694,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264694\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.02685345037700916,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.02685345037700916\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.719029374201788,\n\
\ \"acc_stderr\": 0.01607312785122122,\n \"acc_norm\": 0.719029374201788,\n\
\ \"acc_norm_stderr\": 0.01607312785122122\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806642,\n\
\ \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806642\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n\
\ \"acc_stderr\": 0.016204672385106603,\n \"acc_norm\": 0.376536312849162,\n\
\ \"acc_norm_stderr\": 0.016204672385106603\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n\
\ \"acc_stderr\": 0.02801365189199507,\n \"acc_norm\": 0.5819935691318328,\n\
\ \"acc_norm_stderr\": 0.02801365189199507\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.595679012345679,\n \"acc_stderr\": 0.027306625297327684,\n\
\ \"acc_norm\": 0.595679012345679,\n \"acc_norm_stderr\": 0.027306625297327684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573083,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573083\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4002607561929596,\n\
\ \"acc_stderr\": 0.012513582529136213,\n \"acc_norm\": 0.4002607561929596,\n\
\ \"acc_norm_stderr\": 0.012513582529136213\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213535,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213535\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5441176470588235,\n \"acc_stderr\": 0.020148939420415745,\n \
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.020148939420415745\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839673,\n \"mc2\": 0.4346341998016549,\n\
\ \"mc2_stderr\": 0.014484208469957361\n }\n}\n```"
repo_url: https://huggingface.co/Voicelab/trurl-2-13b-academic
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|arc:challenge|25_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hellaswag|10_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-26-52.608718.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-26-52.608718.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T21-26-52.608718.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T21-26-52.608718.parquet'
- config_name: results
data_files:
- split: 2023_09_21T21_26_52.608718
path:
- results_2023-09-21T21-26-52.608718.parquet
- split: latest
path:
- results_2023-09-21T21-26-52.608718.parquet
---
# Dataset Card for Evaluation run of Voicelab/trurl-2-13b-academic
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Voicelab/trurl-2-13b-academic
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Voicelab/trurl-2-13b-academic](https://huggingface.co/Voicelab/trurl-2-13b-academic) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic",
"harness_truthfulqa_mc_0",
split="train")
```
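As a complementary sketch, assuming only a recent version of the `datasets` library, you can also enumerate the available configurations and load the aggregated "results" config at its "latest" split, matching the config and split names declared in the YAML header above:
```python
from datasets import get_dataset_config_names, load_dataset

# Enumerate the 61 task configurations plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic"
)
print(len(configs), configs[:3])

# Load the aggregated results of the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic",
    "results",
    split="latest",
)
```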
## Latest results
These are the [latest results from run 2023-09-21T21:26:52.608718](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic/blob/main/results_2023-09-21T21-26-52.608718.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.552524555901937,
"acc_stderr": 0.03461609018525162,
"acc_norm": 0.556588349310513,
"acc_norm_stderr": 0.03459893731628851,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839673,
"mc2": 0.4346341998016549,
"mc2_stderr": 0.014484208469957361
},
"harness|arc:challenge|25": {
"acc": 0.5392491467576792,
"acc_stderr": 0.014566303676636586,
"acc_norm": 0.5793515358361775,
"acc_norm_stderr": 0.014426211252508406
},
"harness|hellaswag|10": {
"acc": 0.5957976498705437,
"acc_stderr": 0.0048973407933143795,
"acc_norm": 0.7954590718980283,
"acc_norm_stderr": 0.0040254139486194
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.569811320754717,
"acc_stderr": 0.030471445867183238,
"acc_norm": 0.569811320754717,
"acc_norm_stderr": 0.030471445867183238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3544973544973545,
"acc_stderr": 0.024636830602842,
"acc_norm": 0.3544973544973545,
"acc_norm_stderr": 0.024636830602842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.042857142857142816,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.042857142857142816
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300645,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300645
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.03646204963253812,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.03646204963253812
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7412844036697248,
"acc_stderr": 0.01877605231961963,
"acc_norm": 0.7412844036697248,
"acc_norm_stderr": 0.01877605231961963
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475544,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475544
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613538,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613538
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264694,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264694
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.02685345037700916,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.02685345037700916
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.719029374201788,
"acc_stderr": 0.01607312785122122,
"acc_norm": 0.719029374201788,
"acc_norm_stderr": 0.01607312785122122
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6098265895953757,
"acc_stderr": 0.026261677607806642,
"acc_norm": 0.6098265895953757,
"acc_norm_stderr": 0.026261677607806642
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106603,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106603
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5819935691318328,
"acc_stderr": 0.02801365189199507,
"acc_norm": 0.5819935691318328,
"acc_norm_stderr": 0.02801365189199507
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.595679012345679,
"acc_stderr": 0.027306625297327684,
"acc_norm": 0.595679012345679,
"acc_norm_stderr": 0.027306625297327684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573083,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573083
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4002607561929596,
"acc_stderr": 0.012513582529136213,
"acc_norm": 0.4002607561929596,
"acc_norm_stderr": 0.012513582529136213
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.030343264224213535,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.030343264224213535
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.020148939420415745,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.020148939420415745
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839673,
"mc2": 0.4346341998016549,
"mc2_stderr": 0.014484208469957361
}
}
```
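For a quick sanity check, here is a minimal sketch that recomputes a macro-average accuracy from the per-task entries of the dictionary printed above, assuming it has been saved locally under the hypothetical filename used below (the hosted file may wrap this dictionary in additional metadata):
```python
import json

# Hypothetical local copy of the results dictionary shown above.
with open("results_2023-09-21T21-26-52.608718.json") as f:
    results = json.load(f)

# Average "acc" over every per-task entry; the "all" summary is skipped,
# and the truthfulqa entry only reports mc1/mc2, so the "acc" check skips it.
task_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task != "all" and "acc" in scores
]
print(f"macro-average acc over {len(task_accs)} tasks: "
      f"{sum(task_accs) / len(task_accs):.4f}")
```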
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Xb4con/ric | 2023-09-21T21:59:53.000Z | [
"region:us"
] | Xb4con | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-7b-v12-bf16 | 2023-09-21T22:19:43.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-openllama-7b-v12-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-openllama-7b-v12-bf16](https://huggingface.co/OpenBuddy/openbuddy-openllama-7b-v12-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-7b-v12-bf16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-21T22:18:19.303716](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-7b-v12-bf16/blob/main/results_2023-09-21T22-18-19.303716.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each of them in the \"results\" config and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.464196699749477,\n\
\ \"acc_stderr\": 0.0353890125219607,\n \"acc_norm\": 0.4671786338895238,\n\
\ \"acc_norm_stderr\": 0.035389940639698324,\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.01609588415538685,\n \"mc2\": 0.4518051777934512,\n\
\ \"mc2_stderr\": 0.01508272631700764\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.386518771331058,\n \"acc_stderr\": 0.01423008476191048,\n\
\ \"acc_norm\": 0.4206484641638225,\n \"acc_norm_stderr\": 0.014426211252508403\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.47829117705636326,\n\
\ \"acc_stderr\": 0.00498507609446476,\n \"acc_norm\": 0.6200955984863573,\n\
\ \"acc_norm_stderr\": 0.004843708550386525\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4791666666666667,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.4791666666666667,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n\
\ \"acc_stderr\": 0.03714325906302066,\n \"acc_norm\": 0.3872832369942196,\n\
\ \"acc_norm_stderr\": 0.03714325906302066\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993176,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993176\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400352,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400352\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899208,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899208\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5193548387096775,\n\
\ \"acc_stderr\": 0.02842268740431211,\n \"acc_norm\": 0.5193548387096775,\n\
\ \"acc_norm_stderr\": 0.02842268740431211\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n\
\ \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.45454545454545453,\n \"acc_stderr\": 0.03888176921674098,\n\
\ \"acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.03888176921674098\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6062176165803109,\n \"acc_stderr\": 0.035260770955482405,\n\
\ \"acc_norm\": 0.6062176165803109,\n \"acc_norm_stderr\": 0.035260770955482405\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4128205128205128,\n \"acc_stderr\": 0.024962683564331803,\n\
\ \"acc_norm\": 0.4128205128205128,\n \"acc_norm_stderr\": 0.024962683564331803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945266,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945266\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4369747899159664,\n \"acc_stderr\": 0.03221943636566197,\n \
\ \"acc_norm\": 0.4369747899159664,\n \"acc_norm_stderr\": 0.03221943636566197\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6036697247706422,\n \"acc_stderr\": 0.020971469947900532,\n \"\
acc_norm\": 0.6036697247706422,\n \"acc_norm_stderr\": 0.020971469947900532\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828979,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828979\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.553921568627451,\n \"acc_stderr\": 0.034888454513049734,\n \"\
acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.034888454513049734\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6455696202531646,\n \"acc_stderr\": 0.031137304297185812,\n \
\ \"acc_norm\": 0.6455696202531646,\n \"acc_norm_stderr\": 0.031137304297185812\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5336322869955157,\n\
\ \"acc_stderr\": 0.03348180017060306,\n \"acc_norm\": 0.5336322869955157,\n\
\ \"acc_norm_stderr\": 0.03348180017060306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.04320767807536671,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.04320767807536671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4662576687116564,\n \"acc_stderr\": 0.039194155450484096,\n\
\ \"acc_norm\": 0.4662576687116564,\n \"acc_norm_stderr\": 0.039194155450484096\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n\
\ \"acc_stderr\": 0.029872577708891176,\n \"acc_norm\": 0.7051282051282052,\n\
\ \"acc_norm_stderr\": 0.029872577708891176\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6590038314176245,\n\
\ \"acc_stderr\": 0.016951781383223313,\n \"acc_norm\": 0.6590038314176245,\n\
\ \"acc_norm_stderr\": 0.016951781383223313\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n \
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n\
\ \"acc_stderr\": 0.015748421208187303,\n \"acc_norm\": 0.3318435754189944,\n\
\ \"acc_norm_stderr\": 0.015748421208187303\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.028624412550167958,\n\
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.028624412550167958\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4662379421221865,\n\
\ \"acc_stderr\": 0.02833327710956279,\n \"acc_norm\": 0.4662379421221865,\n\
\ \"acc_norm_stderr\": 0.02833327710956279\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49382716049382713,\n \"acc_stderr\": 0.027818623962583295,\n\
\ \"acc_norm\": 0.49382716049382713,\n \"acc_norm_stderr\": 0.027818623962583295\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573083,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573083\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31747066492829207,\n\
\ \"acc_stderr\": 0.011888892068809312,\n \"acc_norm\": 0.31747066492829207,\n\
\ \"acc_norm_stderr\": 0.011888892068809312\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3897058823529412,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.3897058823529412,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4084967320261438,\n \"acc_stderr\": 0.019886221037501876,\n \
\ \"acc_norm\": 0.4084967320261438,\n \"acc_norm_stderr\": 0.019886221037501876\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794915,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794915\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5428571428571428,\n \"acc_stderr\": 0.031891418324213966,\n\
\ \"acc_norm\": 0.5428571428571428,\n \"acc_norm_stderr\": 0.031891418324213966\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5970149253731343,\n\
\ \"acc_stderr\": 0.03468343295111126,\n \"acc_norm\": 0.5970149253731343,\n\
\ \"acc_norm_stderr\": 0.03468343295111126\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.037777988227480165,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.037777988227480165\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.036996580176568775,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.036996580176568775\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.01609588415538685,\n \"mc2\": 0.4518051777934512,\n\
\ \"mc2_stderr\": 0.01508272631700764\n }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-openllama-7b-v12-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|arc:challenge|25_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hellaswag|10_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-18-19.303716.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-18-19.303716.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T22-18-19.303716.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T22-18-19.303716.parquet'
- config_name: results
data_files:
- split: 2023_09_21T22_18_19.303716
path:
- results_2023-09-21T22-18-19.303716.parquet
- split: latest
path:
- results_2023-09-21T22-18-19.303716.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-openllama-7b-v12-bf16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-openllama-7b-v12-bf16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-openllama-7b-v12-bf16](https://huggingface.co/OpenBuddy/openbuddy-openllama-7b-v12-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the results of the most recent run.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
# Per-sample details for one task config; "latest" is the most recent timestamped split
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-7b-v12-bf16",
	"harness_truthfulqa_mc_0",
	split="latest")
```
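The aggregated scores live in the "results" configuration declared in the YAML header above. As a minimal sketch (reusing the config and split names listed there), they can be loaded the same way:
```python
from datasets import load_dataset

# Aggregated metrics for this evaluation run
results = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-7b-v12-bf16",
	"results",
	split="latest")
print(results[0])  # inspect the first row of aggregated scores
```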
## Latest results
These are the [latest results from run 2023-09-21T22:18:19.303716](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-7b-v12-bf16/blob/main/results_2023-09-21T22-18-19.303716.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.464196699749477,
"acc_stderr": 0.0353890125219607,
"acc_norm": 0.4671786338895238,
"acc_norm_stderr": 0.035389940639698324,
"mc1": 0.30354957160342716,
"mc1_stderr": 0.01609588415538685,
"mc2": 0.4518051777934512,
"mc2_stderr": 0.01508272631700764
},
"harness|arc:challenge|25": {
"acc": 0.386518771331058,
"acc_stderr": 0.01423008476191048,
"acc_norm": 0.4206484641638225,
"acc_norm_stderr": 0.014426211252508403
},
"harness|hellaswag|10": {
"acc": 0.47829117705636326,
"acc_stderr": 0.00498507609446476,
"acc_norm": 0.6200955984863573,
"acc_norm_stderr": 0.004843708550386525
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4791666666666667,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.4791666666666667,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.03714325906302066,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.03714325906302066
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993176,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993176
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400352,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400352
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899208,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5193548387096775,
"acc_stderr": 0.02842268740431211,
"acc_norm": 0.5193548387096775,
"acc_norm_stderr": 0.02842268740431211
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.03888176921674098,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.03888176921674098
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6062176165803109,
"acc_stderr": 0.035260770955482405,
"acc_norm": 0.6062176165803109,
"acc_norm_stderr": 0.035260770955482405
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4128205128205128,
"acc_stderr": 0.024962683564331803,
"acc_norm": 0.4128205128205128,
"acc_norm_stderr": 0.024962683564331803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945266,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945266
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4369747899159664,
"acc_stderr": 0.03221943636566197,
"acc_norm": 0.4369747899159664,
"acc_norm_stderr": 0.03221943636566197
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6036697247706422,
"acc_stderr": 0.020971469947900532,
"acc_norm": 0.6036697247706422,
"acc_norm_stderr": 0.020971469947900532
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828979,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828979
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.034888454513049734,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.034888454513049734
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6455696202531646,
"acc_stderr": 0.031137304297185812,
"acc_norm": 0.6455696202531646,
"acc_norm_stderr": 0.031137304297185812
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5336322869955157,
"acc_stderr": 0.03348180017060306,
"acc_norm": 0.5336322869955157,
"acc_norm_stderr": 0.03348180017060306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.04320767807536671,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.04320767807536671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4662576687116564,
"acc_stderr": 0.039194155450484096,
"acc_norm": 0.4662576687116564,
"acc_norm_stderr": 0.039194155450484096
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.029872577708891176,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.029872577708891176
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6590038314176245,
"acc_stderr": 0.016951781383223313,
"acc_norm": 0.6590038314176245,
"acc_norm_stderr": 0.016951781383223313
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5,
"acc_stderr": 0.026919095102908273,
"acc_norm": 0.5,
"acc_norm_stderr": 0.026919095102908273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187303,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187303
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.028624412550167958,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.028624412550167958
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4662379421221865,
"acc_stderr": 0.02833327710956279,
"acc_norm": 0.4662379421221865,
"acc_norm_stderr": 0.02833327710956279
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49382716049382713,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.49382716049382713,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573083,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573083
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31747066492829207,
"acc_stderr": 0.011888892068809312,
"acc_norm": 0.31747066492829207,
"acc_norm_stderr": 0.011888892068809312
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3897058823529412,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.3897058823529412,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4084967320261438,
"acc_stderr": 0.019886221037501876,
"acc_norm": 0.4084967320261438,
"acc_norm_stderr": 0.019886221037501876
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794915,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794915
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5428571428571428,
"acc_stderr": 0.031891418324213966,
"acc_norm": 0.5428571428571428,
"acc_norm_stderr": 0.031891418324213966
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5970149253731343,
"acc_stderr": 0.03468343295111126,
"acc_norm": 0.5970149253731343,
"acc_norm_stderr": 0.03468343295111126
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.037777988227480165,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.037777988227480165
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.036996580176568775,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.036996580176568775
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30354957160342716,
"mc1_stderr": 0.01609588415538685,
"mc2": 0.4518051777934512,
"mc2_stderr": 0.01508272631700764
}
}
```
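Since the link above points at the raw results JSON in the repo, the same numbers can also be fetched without going through `datasets`. A minimal sketch using `huggingface_hub` (the filename is the one listed in the "results" config above; the exact JSON layout may differ slightly from the dictionary shown):
```python
import json
from huggingface_hub import hf_hub_download

# Download the aggregated results file straight from the dataset repo
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-7b-v12-bf16",
    filename="results_2023-09-21T22-18-19.303716.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(list(results))  # top-level keys; the per-task metrics mirror the dictionary above
```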
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dharun2049/dermaflow-v1 | 2023-09-21T23:40:42.000Z | [
"task_categories:image-classification",
"size_categories:n<1K",
"license:apache-2.0",
"biology",
"region:us"
] | dharun2049 | null | null | null | 0 | 0 | ---
task_categories:
- image-classification
license: apache-2.0
tags:
- biology
size_categories:
- n<1K
---
# AutoTrain Dataset for project: vision-transformer
## Dataset Description
This dataset has been automatically processed by AutoTrain for project vision-transformer.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<224x224 RGB PIL image>",
"target": 0
},
{
"image": "<224x224 RGB PIL image>",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['benign', 'malignant'], id=None)"
}
```
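As a minimal sketch of how these fields might be consumed (assuming the repo loads directly by id with the standard `datasets` API):
```python
from datasets import load_dataset

# Hypothetical usage: load the AutoTrain-processed dataset and decode one example
ds = load_dataset("dharun2049/dermaflow-v1")
example = ds["train"][0]
print(example["image"].size)  # a 224x224 RGB PIL image, per the sample above
names = ds["train"].features["target"].names
print(names[example["target"]])  # "benign" or "malignant"
```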
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 397 |
| valid | 101 | |
open-llm-leaderboard/details_Secbone__llama-33B-instructed | 2023-09-21T22:25:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Secbone/llama-33B-instructed
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Secbone/llama-33B-instructed](https://huggingface.co/Secbone/llama-33B-instructed)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the results of the\
\ most recent run.\n\nAn additional configuration \"results\" stores all the aggregated\
\ results of the run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Secbone__llama-33B-instructed\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-21T22:23:50.443527](https://huggingface.co/datasets/open-llm-leaderboard/details_Secbone__llama-33B-instructed/blob/main/results_2023-09-21T22-23-50.443527.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6066036411744823,\n\
\ \"acc_stderr\": 0.03349129169090783,\n \"acc_norm\": 0.6100871650568858,\n\
\ \"acc_norm_stderr\": 0.033467495423430915,\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.01607750926613303,\n \"mc2\": 0.4411804826699554,\n\
\ \"mc2_stderr\": 0.01536785007781839\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.014169664520303101,\n\
\ \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.013975454122756557\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6800438159729137,\n\
\ \"acc_stderr\": 0.004655059308602616,\n \"acc_norm\": 0.8616809400517825,\n\
\ \"acc_norm_stderr\": 0.0034452899250117337\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526066,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526066\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115978,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115978\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n\
\ \"acc_stderr\": 0.02573654274559453,\n \"acc_norm\": 0.7129032258064516,\n\
\ \"acc_norm_stderr\": 0.02573654274559453\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.034819048444388045,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.034819048444388045\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919432,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919432\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.02478431694215638,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.02478431694215638\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059278,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059278\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200154,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200154\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640773,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640773\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8270042194092827,\n \"acc_stderr\": 0.02462156286676842,\n \
\ \"acc_norm\": 0.8270042194092827,\n \"acc_norm_stderr\": 0.02462156286676842\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575498,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575498\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847835,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847835\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n\
\ \"acc_stderr\": 0.014419123980931899,\n \"acc_norm\": 0.7956577266922095,\n\
\ \"acc_norm_stderr\": 0.014419123980931899\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4569832402234637,\n\
\ \"acc_stderr\": 0.01666049858050917,\n \"acc_norm\": 0.4569832402234637,\n\
\ \"acc_norm_stderr\": 0.01666049858050917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.027305308076274695,\n\
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.027305308076274695\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5013037809647979,\n\
\ \"acc_stderr\": 0.012770192691057112,\n \"acc_norm\": 0.5013037809647979,\n\
\ \"acc_norm_stderr\": 0.012770192691057112\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.02993534270787774,\n\
\ \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.02993534270787774\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.01607750926613303,\n \"mc2\": 0.4411804826699554,\n\
\ \"mc2_stderr\": 0.01536785007781839\n }\n}\n```"
repo_url: https://huggingface.co/Secbone/llama-33B-instructed
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|arc:challenge|25_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hellaswag|10_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-23-50.443527.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-23-50.443527.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T22-23-50.443527.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T22-23-50.443527.parquet'
- config_name: results
data_files:
- split: 2023_09_21T22_23_50.443527
path:
- results_2023-09-21T22-23-50.443527.parquet
- split: latest
path:
- results_2023-09-21T22-23-50.443527.parquet
---
# Dataset Card for Evaluation run of Secbone/llama-33B-instructed
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Secbone/llama-33B-instructed
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Secbone/llama-33B-instructed](https://huggingface.co/Secbone/llama-33B-instructed) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Secbone__llama-33B-instructed",
"harness_truthfulqa_mc_0",
	split="latest")
```
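You can also load the aggregated metrics or pin a specific run in the same way. A minimal sketch (the config and split names below are taken verbatim from the configuration list in this card):
```python
from datasets import load_dataset

# Aggregated metrics for the run (the "results" config defined above)
results = load_dataset(
    "open-llm-leaderboard/details_Secbone__llama-33B-instructed",
    "results",
    split="latest",
)

# A single task pinned to the timestamped split instead of "latest"
virology = load_dataset(
    "open-llm-leaderboard/details_Secbone__llama-33B-instructed",
    "harness_hendrycksTest_virology_5",
    split="2023_09_21T22_23_50.443527",
)
```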
## Latest results
These are the [latest results from run 2023-09-21T22:23:50.443527](https://huggingface.co/datasets/open-llm-leaderboard/details_Secbone__llama-33B-instructed/blob/main/results_2023-09-21T22-23-50.443527.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6066036411744823,
"acc_stderr": 0.03349129169090783,
"acc_norm": 0.6100871650568858,
"acc_norm_stderr": 0.033467495423430915,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.01607750926613303,
"mc2": 0.4411804826699554,
"mc2_stderr": 0.01536785007781839
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.014169664520303101,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.013975454122756557
},
"harness|hellaswag|10": {
"acc": 0.6800438159729137,
"acc_stderr": 0.004655059308602616,
"acc_norm": 0.8616809400517825,
"acc_norm_stderr": 0.0034452899250117337
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.02494236893115978,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.02494236893115978
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7129032258064516,
"acc_stderr": 0.02573654274559453,
"acc_norm": 0.7129032258064516,
"acc_norm_stderr": 0.02573654274559453
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919432,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919432
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.02478431694215638,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.02478431694215638
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059278,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059278
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200154,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640773,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640773
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8270042194092827,
"acc_stderr": 0.02462156286676842,
"acc_norm": 0.8270042194092827,
"acc_norm_stderr": 0.02462156286676842
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575498,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575498
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847835,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847835
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7956577266922095,
"acc_stderr": 0.014419123980931899,
"acc_norm": 0.7956577266922095,
"acc_norm_stderr": 0.014419123980931899
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4569832402234637,
"acc_stderr": 0.01666049858050917,
"acc_norm": 0.4569832402234637,
"acc_norm_stderr": 0.01666049858050917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.027305308076274695,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.027305308076274695
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5013037809647979,
"acc_stderr": 0.012770192691057112,
"acc_norm": 0.5013037809647979,
"acc_norm_stderr": 0.012770192691057112
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.02993534270787774,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.02993534270787774
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013024,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.01607750926613303,
"mc2": 0.4411804826699554,
"mc2_stderr": 0.01536785007781839
}
}
```
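If you only need these aggregated numbers, you can also fetch the results JSON directly with `huggingface_hub`. A minimal sketch, assuming the file keeps the name linked above and mirrors the top-level layout of the snippet:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file from this dataset repository
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Secbone__llama-33B-instructed",
    filename="results_2023-09-21T22-23-50.443527.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Assumes the JSON top level matches the snippet shown above
print(results["all"]["acc"], results["all"]["mc2"])
```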
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_codeparrot__codeparrot | 2023-09-21T22:36:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of codeparrot/codeparrot
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codeparrot/codeparrot](https://huggingface.co/codeparrot/codeparrot) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codeparrot__codeparrot\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-21T22:35:18.428619](https://huggingface.co/datasets/open-llm-leaderboard/details_codeparrot__codeparrot/blob/main/results_2023-09-21T22-35-18.428619.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25454853165365626,\n\
\ \"acc_stderr\": 0.031506316972577275,\n \"acc_norm\": 0.25535700625181734,\n\
\ \"acc_norm_stderr\": 0.03152128733100944,\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133022,\n \"mc2\": 0.5086962165189903,\n\
\ \"mc2_stderr\": 0.015980319470603785\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.17918088737201365,\n \"acc_stderr\": 0.011207045216615665,\n\
\ \"acc_norm\": 0.2167235494880546,\n \"acc_norm_stderr\": 0.012040156713481192\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27325234017128064,\n\
\ \"acc_stderr\": 0.004447185883327457,\n \"acc_norm\": 0.28340967934674366,\n\
\ \"acc_norm_stderr\": 0.004497325533959625\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.036333844140734636,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.036333844140734636\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533156,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533156\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108614,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108614\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.035146974678623884,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.035146974678623884\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.033450369167889925,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.033450369167889925\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.030976692998534432,\n\
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.030976692998534432\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604675,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604675\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626303,\n \"\
acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626303\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.034356961683613546,\n\
\ \"acc_norm\": 0.3471502590673575,\n \"acc_norm_stderr\": 0.034356961683613546\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3128205128205128,\n \"acc_stderr\": 0.02350757902064535,\n \
\ \"acc_norm\": 0.3128205128205128,\n \"acc_norm_stderr\": 0.02350757902064535\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882378,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882378\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3504587155963303,\n \"acc_stderr\": 0.020456077599824457,\n \"\
acc_norm\": 0.3504587155963303,\n \"acc_norm_stderr\": 0.020456077599824457\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22549019607843138,\n \"acc_stderr\": 0.02933116229425173,\n \"\
acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02933116229425173\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.11210762331838565,\n\
\ \"acc_stderr\": 0.021174894206346103,\n \"acc_norm\": 0.11210762331838565,\n\
\ \"acc_norm_stderr\": 0.021174894206346103\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.17557251908396945,\n \"acc_stderr\": 0.033368203384760736,\n\
\ \"acc_norm\": 0.17557251908396945,\n \"acc_norm_stderr\": 0.033368203384760736\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.33884297520661155,\n \"acc_stderr\": 0.043207678075366705,\n \"\
acc_norm\": 0.33884297520661155,\n \"acc_norm_stderr\": 0.043207678075366705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.038935425188248475,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.038935425188248475\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3883495145631068,\n \"acc_stderr\": 0.0482572933735639,\n\
\ \"acc_norm\": 0.3883495145631068,\n \"acc_norm_stderr\": 0.0482572933735639\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18376068376068377,\n\
\ \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.18376068376068377,\n\
\ \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21328224776500637,\n\
\ \"acc_stderr\": 0.014648172749593527,\n \"acc_norm\": 0.21328224776500637,\n\
\ \"acc_norm_stderr\": 0.014648172749593527\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21676300578034682,\n \"acc_stderr\": 0.02218347766841286,\n\
\ \"acc_norm\": 0.21676300578034682,\n \"acc_norm_stderr\": 0.02218347766841286\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.01421957078810399,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.01421957078810399\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427905,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.17363344051446947,\n\
\ \"acc_stderr\": 0.02151405158597041,\n \"acc_norm\": 0.17363344051446947,\n\
\ \"acc_norm_stderr\": 0.02151405158597041\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543343,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543343\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880596,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880596\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n\
\ \"acc_stderr\": 0.011005971399927242,\n \"acc_norm\": 0.24641460234680573,\n\
\ \"acc_norm_stderr\": 0.011005971399927242\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2610294117647059,\n \"acc_stderr\": 0.02667925227010312,\n\
\ \"acc_norm\": 0.2610294117647059,\n \"acc_norm_stderr\": 0.02667925227010312\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2696078431372549,\n \"acc_stderr\": 0.017952449196987862,\n \
\ \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.017952449196987862\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782834,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22857142857142856,\n \"acc_stderr\": 0.02688214492230774,\n\
\ \"acc_norm\": 0.22857142857142856,\n \"acc_norm_stderr\": 0.02688214492230774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.02992941540834838,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.02992941540834838\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n\
\ \"acc_stderr\": 0.030709824050565274,\n \"acc_norm\": 0.1927710843373494,\n\
\ \"acc_norm_stderr\": 0.030709824050565274\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.03301405946987249,\n\
\ \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.03301405946987249\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133022,\n \"mc2\": 0.5086962165189903,\n\
\ \"mc2_stderr\": 0.015980319470603785\n }\n}\n```"
repo_url: https://huggingface.co/codeparrot/codeparrot
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|arc:challenge|25_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hellaswag|10_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-35-18.428619.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-35-18.428619.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T22-35-18.428619.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T22-35-18.428619.parquet'
- config_name: results
data_files:
- split: 2023_09_21T22_35_18.428619
path:
- results_2023-09-21T22-35-18.428619.parquet
- split: latest
path:
- results_2023-09-21T22-35-18.428619.parquet
---
# Dataset Card for Evaluation run of codeparrot/codeparrot
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codeparrot/codeparrot
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codeparrot/codeparrot](https://huggingface.co/codeparrot/codeparrot) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codeparrot__codeparrot",
"harness_truthfulqa_mc_0",
	split="latest")
```
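For the aggregated numbers, a minimal sketch along the same lines (using the "results" config and the "latest" split defined in the metadata above) is:
```python
from datasets import load_dataset

# "results" holds one row per run with the aggregated metrics;
# the "latest" split always resolves to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_codeparrot__codeparrot",
    "results",
    split="latest",
)
print(results[0])
```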
## Latest results
These are the [latest results from run 2023-09-21T22:35:18.428619](https://huggingface.co/datasets/open-llm-leaderboard/details_codeparrot__codeparrot/blob/main/results_2023-09-21T22-35-18.428619.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25454853165365626,
"acc_stderr": 0.031506316972577275,
"acc_norm": 0.25535700625181734,
"acc_norm_stderr": 0.03152128733100944,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133022,
"mc2": 0.5086962165189903,
"mc2_stderr": 0.015980319470603785
},
"harness|arc:challenge|25": {
"acc": 0.17918088737201365,
"acc_stderr": 0.011207045216615665,
"acc_norm": 0.2167235494880546,
"acc_norm_stderr": 0.012040156713481192
},
"harness|hellaswag|10": {
"acc": 0.27325234017128064,
"acc_stderr": 0.004447185883327457,
"acc_norm": 0.28340967934674366,
"acc_norm_stderr": 0.004497325533959625
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.036333844140734636,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.036333844140734636
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533156,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533156
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.025447863825108614,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.025447863825108614
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.035146974678623884,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.035146974678623884
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.033450369167889925,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.033450369167889925
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.030976692998534432,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.030976692998534432
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643895,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643895
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604675,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604675
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.03371124142626303,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.03371124142626303
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3471502590673575,
"acc_stderr": 0.034356961683613546,
"acc_norm": 0.3471502590673575,
"acc_norm_stderr": 0.034356961683613546
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3128205128205128,
"acc_stderr": 0.02350757902064535,
"acc_norm": 0.3128205128205128,
"acc_norm_stderr": 0.02350757902064535
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882378,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3504587155963303,
"acc_stderr": 0.020456077599824457,
"acc_norm": 0.3504587155963303,
"acc_norm_stderr": 0.020456077599824457
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.02933116229425173,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.02933116229425173
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.11210762331838565,
"acc_stderr": 0.021174894206346103,
"acc_norm": 0.11210762331838565,
"acc_norm_stderr": 0.021174894206346103
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.17557251908396945,
"acc_stderr": 0.033368203384760736,
"acc_norm": 0.17557251908396945,
"acc_norm_stderr": 0.033368203384760736
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.33884297520661155,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.33884297520661155,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.038935425188248475,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.038935425188248475
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.3883495145631068,
"acc_stderr": 0.0482572933735639,
"acc_norm": 0.3883495145631068,
"acc_norm_stderr": 0.0482572933735639
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18376068376068377,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.18376068376068377,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21328224776500637,
"acc_stderr": 0.014648172749593527,
"acc_norm": 0.21328224776500637,
"acc_norm_stderr": 0.014648172749593527
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21676300578034682,
"acc_stderr": 0.02218347766841286,
"acc_norm": 0.21676300578034682,
"acc_norm_stderr": 0.02218347766841286
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.01421957078810399,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.01421957078810399
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.17363344051446947,
"acc_stderr": 0.02151405158597041,
"acc_norm": 0.17363344051446947,
"acc_norm_stderr": 0.02151405158597041
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023132376234543343,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023132376234543343
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880596,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880596
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.011005971399927242,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.011005971399927242
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2610294117647059,
"acc_stderr": 0.02667925227010312,
"acc_norm": 0.2610294117647059,
"acc_norm_stderr": 0.02667925227010312
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.017952449196987862,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.017952449196987862
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782834,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22857142857142856,
"acc_stderr": 0.02688214492230774,
"acc_norm": 0.22857142857142856,
"acc_norm_stderr": 0.02688214492230774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.02992941540834838,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.02992941540834838
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.030709824050565274,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.030709824050565274
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133022,
"mc2": 0.5086962165189903,
"mc2_stderr": 0.015980319470603785
}
}
```
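As a quick worked example, the per-task numbers above can be aggregated by hand. This is only a sketch: it assumes a local copy of the results file named as in the link above, with exactly the structure printed here.
```python
import json

# Hypothetical local copy of the results JSON shown above.
with open("results_2023-09-21T22-35-18.428619.json") as f:
    results = json.load(f)

# Average the accuracies of the 57 MMLU (hendrycksTest) tasks.
mmlu = {name: metrics for name, metrics in results.items()
        if name.startswith("harness|hendrycksTest-")}
mean_acc = sum(m["acc"] for m in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU tasks, mean acc = {mean_acc:.4f}")
```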
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dharun2049/autotrain-data-dermaflow | 2023-09-21T22:41:46.000Z | [
"region:us"
] | dharun2049 | null | null | null | 0 | 0 | Entry not found |
dharun2049/autotrain-data-derma-flow | 2023-09-21T22:44:01.000Z | [
"region:us"
] | dharun2049 | null | null | null | 0 | 0 | Entry not found |
dharun2049/autotrain-data-skinnnnnnn | 2023-09-21T22:51:24.000Z | [
"task_categories:image-classification",
"region:us"
] | dharun2049 | null | null | null | 0 | 0 | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: skinnnnnnn
## Dataset Description
This dataset has been automatically processed by AutoTrain for project skinnnnnnn.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<224x224 RGB PIL image>",
"target": 0
},
{
"image": "<224x224 RGB PIL image>",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['benign', 'malignant'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 397 |
| valid | 101 |
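As a minimal usage sketch (assuming this repository id and public access; the field and split names mirror the card above):
```python
from datasets import load_dataset

# Hypothetical load of this AutoTrain dataset; access settings may differ.
ds = load_dataset("dharun2049/autotrain-data-skinnnnnnn")

sample = ds["train"][0]
print(sample["image"].size)                  # (224, 224) RGB PIL image
print(ds["train"].features["target"].names)  # ['benign', 'malignant']
print(len(ds["train"]), len(ds["valid"]))    # 397, 101
```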
|
dharun2049/autotrain-data-skinwalker | 2023-09-21T22:53:29.000Z | [
"region:us"
] | dharun2049 | null | null | null | 0 | 0 | Entry not found |
dharun2049/autotrain-data-bingbongdomh | 2023-09-21T23:01:44.000Z | [
"task_categories:image-classification",
"region:us"
] | dharun2049 | null | null | null | 0 | 0 | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: bingbongdomh
## Dataset Description
This dataset has been automatically processed by AutoTrain for project bingbongdomh.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<224x224 RGB PIL image>",
"target": 0
},
{
"image": "<224x224 RGB PIL image>",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['benign', 'malignant'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follow:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 397 |
| valid | 101 |
|
open-llm-leaderboard/details_speechlessai__speechless-codellama-airoboros-orca-platypus-13b | 2023-09-21T22:56:35.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of speechlessai/speechless-codellama-airoboros-orca-platypus-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [speechlessai/speechless-codellama-airoboros-orca-platypus-13b](https://huggingface.co/speechlessai/speechless-codellama-airoboros-orca-platypus-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_speechlessai__speechless-codellama-airoboros-orca-platypus-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-21T22:55:13.794289](https://huggingface.co/datasets/open-llm-leaderboard/details_speechlessai__speechless-codellama-airoboros-orca-platypus-13b/blob/main/results_2023-09-21T22-55-13.794289.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.43226677376062445,\n\
\ \"acc_stderr\": 0.03529231182645031,\n \"acc_norm\": 0.4360535332223343,\n\
\ \"acc_norm_stderr\": 0.03528955553441052,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766367,\n \"mc2\": 0.4087530329179874,\n\
\ \"mc2_stderr\": 0.014474641066490662\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4104095563139932,\n \"acc_stderr\": 0.014374922192642664,\n\
\ \"acc_norm\": 0.44880546075085326,\n \"acc_norm_stderr\": 0.014534599585097662\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49193387771360286,\n\
\ \"acc_stderr\": 0.004989132075598775,\n \"acc_norm\": 0.676956781517626,\n\
\ \"acc_norm_stderr\": 0.004666833452796198\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4528301886792453,\n \"acc_stderr\": 0.03063562795796182,\n\
\ \"acc_norm\": 0.4528301886792453,\n \"acc_norm_stderr\": 0.03063562795796182\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3958333333333333,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.3958333333333333,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n\
\ \"acc_stderr\": 0.03692820767264867,\n \"acc_norm\": 0.37572254335260113,\n\
\ \"acc_norm_stderr\": 0.03692820767264867\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.041443118108781526,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.041443118108781526\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842509,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842509\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4806451612903226,\n\
\ \"acc_stderr\": 0.02842268740431211,\n \"acc_norm\": 0.4806451612903226,\n\
\ \"acc_norm_stderr\": 0.02842268740431211\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868407,\n\
\ \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868407\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.03872592983524754,\n\
\ \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.03872592983524754\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.035623524993954825,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.035623524993954825\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.538860103626943,\n \"acc_stderr\": 0.035975244117345775,\n\
\ \"acc_norm\": 0.538860103626943,\n \"acc_norm_stderr\": 0.035975244117345775\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3564102564102564,\n \"acc_stderr\": 0.02428314052946729,\n \
\ \"acc_norm\": 0.3564102564102564,\n \"acc_norm_stderr\": 0.02428314052946729\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230203,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230203\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40336134453781514,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.40336134453781514,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5467889908256881,\n \"acc_stderr\": 0.021343255165546034,\n \"\
acc_norm\": 0.5467889908256881,\n \"acc_norm_stderr\": 0.021343255165546034\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"\
acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.553921568627451,\n \"acc_stderr\": 0.034888454513049734,\n \"\
acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.034888454513049734\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5654008438818565,\n \"acc_stderr\": 0.03226759995510145,\n \
\ \"acc_norm\": 0.5654008438818565,\n \"acc_norm_stderr\": 0.03226759995510145\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4663677130044843,\n\
\ \"acc_stderr\": 0.033481800170603065,\n \"acc_norm\": 0.4663677130044843,\n\
\ \"acc_norm_stderr\": 0.033481800170603065\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3969465648854962,\n \"acc_stderr\": 0.04291135671009223,\n\
\ \"acc_norm\": 0.3969465648854962,\n \"acc_norm_stderr\": 0.04291135671009223\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.46296296296296297,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.46296296296296297,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6965811965811965,\n\
\ \"acc_stderr\": 0.030118210106942638,\n \"acc_norm\": 0.6965811965811965,\n\
\ \"acc_norm_stderr\": 0.030118210106942638\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5312899106002554,\n\
\ \"acc_stderr\": 0.01784491809046855,\n \"acc_norm\": 0.5312899106002554,\n\
\ \"acc_norm_stderr\": 0.01784491809046855\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.026842985519615375,\n\
\ \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.026842985519615375\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26033519553072626,\n\
\ \"acc_stderr\": 0.014676252009319463,\n \"acc_norm\": 0.26033519553072626,\n\
\ \"acc_norm_stderr\": 0.014676252009319463\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.02807415894760066,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.02807415894760066\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4758842443729904,\n\
\ \"acc_stderr\": 0.028365041542564577,\n \"acc_norm\": 0.4758842443729904,\n\
\ \"acc_norm_stderr\": 0.028365041542564577\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.44135802469135804,\n \"acc_stderr\": 0.027628737155668773,\n\
\ \"acc_norm\": 0.44135802469135804,\n \"acc_norm_stderr\": 0.027628737155668773\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.0289473388516141,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.0289473388516141\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.303129074315515,\n\
\ \"acc_stderr\": 0.011738669951254305,\n \"acc_norm\": 0.303129074315515,\n\
\ \"acc_norm_stderr\": 0.011738669951254305\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.39215686274509803,\n \"acc_stderr\": 0.01975172650876263,\n \
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.01975172650876263\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.047764491623961985,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.047764491623961985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.48258706467661694,\n\
\ \"acc_stderr\": 0.03533389234739245,\n \"acc_norm\": 0.48258706467661694,\n\
\ \"acc_norm_stderr\": 0.03533389234739245\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.543859649122807,\n \"acc_stderr\": 0.03820042586602967,\n\
\ \"acc_norm\": 0.543859649122807,\n \"acc_norm_stderr\": 0.03820042586602967\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766367,\n \"mc2\": 0.4087530329179874,\n\
\ \"mc2_stderr\": 0.014474641066490662\n }\n}\n```"
repo_url: https://huggingface.co/speechlessai/speechless-codellama-airoboros-orca-platypus-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|arc:challenge|25_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hellaswag|10_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-55-13.794289.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T22-55-13.794289.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T22-55-13.794289.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T22-55-13.794289.parquet'
- config_name: results
data_files:
- split: 2023_09_21T22_55_13.794289
path:
- results_2023-09-21T22-55-13.794289.parquet
- split: latest
path:
- results_2023-09-21T22-55-13.794289.parquet
---
# Dataset Card for Evaluation run of speechlessai/speechless-codellama-airoboros-orca-platypus-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/speechlessai/speechless-codellama-airoboros-orca-platypus-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [speechlessai/speechless-codellama-airoboros-orca-platypus-13b](https://huggingface.co/speechlessai/speechless-codellama-airoboros-orca-platypus-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_speechlessai__speechless-codellama-airoboros-orca-platypus-13b",
"harness_truthfulqa_mc_0",
split="train")
```
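To inspect the aggregated metrics rather than the per-sample details, the "results" configuration can be loaded in the same way. This is a minimal sketch assuming the configuration and split names declared in the YAML above (where the "latest" split resolves to the newest run):
```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent evaluation run;
# the "results" configuration points at the results_*.parquet files.
results = load_dataset(
    "open-llm-leaderboard/details_speechlessai__speechless-codellama-airoboros-orca-platypus-13b",
    "results",
    split="latest",
)
```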
## Latest results
These are the [latest results from run 2023-09-21T22:55:13.794289](https://huggingface.co/datasets/open-llm-leaderboard/details_speechlessai__speechless-codellama-airoboros-orca-platypus-13b/blob/main/results_2023-09-21T22-55-13.794289.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.43226677376062445,
"acc_stderr": 0.03529231182645031,
"acc_norm": 0.4360535332223343,
"acc_norm_stderr": 0.03528955553441052,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766367,
"mc2": 0.4087530329179874,
"mc2_stderr": 0.014474641066490662
},
"harness|arc:challenge|25": {
"acc": 0.4104095563139932,
"acc_stderr": 0.014374922192642664,
"acc_norm": 0.44880546075085326,
"acc_norm_stderr": 0.014534599585097662
},
"harness|hellaswag|10": {
"acc": 0.49193387771360286,
"acc_stderr": 0.004989132075598775,
"acc_norm": 0.676956781517626,
"acc_norm_stderr": 0.004666833452796198
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4528301886792453,
"acc_stderr": 0.03063562795796182,
"acc_norm": 0.4528301886792453,
"acc_norm_stderr": 0.03063562795796182
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3958333333333333,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.3958333333333333,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.03692820767264867,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.03692820767264867
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.041443118108781526,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.041443118108781526
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.02422996529842509,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.02422996529842509
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4806451612903226,
"acc_stderr": 0.02842268740431211,
"acc_norm": 0.4806451612903226,
"acc_norm_stderr": 0.02842268740431211
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868407,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868407
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.03872592983524754,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.03872592983524754
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5,
"acc_stderr": 0.035623524993954825,
"acc_norm": 0.5,
"acc_norm_stderr": 0.035623524993954825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.538860103626943,
"acc_stderr": 0.035975244117345775,
"acc_norm": 0.538860103626943,
"acc_norm_stderr": 0.035975244117345775
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3564102564102564,
"acc_stderr": 0.02428314052946729,
"acc_norm": 0.3564102564102564,
"acc_norm_stderr": 0.02428314052946729
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230203,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230203
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40336134453781514,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.40336134453781514,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5467889908256881,
"acc_stderr": 0.021343255165546034,
"acc_norm": 0.5467889908256881,
"acc_norm_stderr": 0.021343255165546034
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.03203614084670058,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.03203614084670058
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.034888454513049734,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.034888454513049734
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5654008438818565,
"acc_stderr": 0.03226759995510145,
"acc_norm": 0.5654008438818565,
"acc_norm_stderr": 0.03226759995510145
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4663677130044843,
"acc_stderr": 0.033481800170603065,
"acc_norm": 0.4663677130044843,
"acc_norm_stderr": 0.033481800170603065
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3969465648854962,
"acc_stderr": 0.04291135671009223,
"acc_norm": 0.3969465648854962,
"acc_norm_stderr": 0.04291135671009223
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6965811965811965,
"acc_stderr": 0.030118210106942638,
"acc_norm": 0.6965811965811965,
"acc_norm_stderr": 0.030118210106942638
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5312899106002554,
"acc_stderr": 0.01784491809046855,
"acc_norm": 0.5312899106002554,
"acc_norm_stderr": 0.01784491809046855
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26033519553072626,
"acc_stderr": 0.014676252009319463,
"acc_norm": 0.26033519553072626,
"acc_norm_stderr": 0.014676252009319463
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.02807415894760066,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.02807415894760066
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4758842443729904,
"acc_stderr": 0.028365041542564577,
"acc_norm": 0.4758842443729904,
"acc_norm_stderr": 0.028365041542564577
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.44135802469135804,
"acc_stderr": 0.027628737155668773,
"acc_norm": 0.44135802469135804,
"acc_norm_stderr": 0.027628737155668773
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.0289473388516141,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.0289473388516141
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.303129074315515,
"acc_stderr": 0.011738669951254305,
"acc_norm": 0.303129074315515,
"acc_norm_stderr": 0.011738669951254305
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.01975172650876263,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.01975172650876263
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.047764491623961985,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.047764491623961985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.48258706467661694,
"acc_stderr": 0.03533389234739245,
"acc_norm": 0.48258706467661694,
"acc_norm_stderr": 0.03533389234739245
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.03820042586602967,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.03820042586602967
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766367,
"mc2": 0.4087530329179874,
"mc2_stderr": 0.014474641066490662
}
}
```
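As a quick sketch (assuming the JSON structure shown above), the aggregate metrics can also be read directly from a downloaded results file:
```python
import json

# Parse the per-run results file and print the overall accuracies.
with open("results_2023-09-21T22-55-13.794289.json") as f:
    results = json.load(f)

print(results["all"]["acc"], results["all"]["acc_norm"])
```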
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
SohanAnisetty/HumanMotionSMPL | 2023-09-21T23:04:11.000Z | [
"region:us"
] | SohanAnisetty | null | null | null | 0 | 0 | Entry not found |
dharun2049/autotrain-data-vit-skin-derna | 2023-09-21T23:19:24.000Z | [
"task_categories:image-classification",
"region:us"
] | dharun2049 | null | null | null | 0 | 0 | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: vit-skin-derna
## Dataset Description
This dataset has been automatically processed by AutoTrain for project vit-skin-derna.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<32x32 RGB PIL image>",
"target": 4
},
{
"image": "<32x32 RGB PIL image>",
"target": 8
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck'], id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 40000 |
| valid | 10000 |
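As a minimal sketch (assuming the dataset is published under the `dharun2049/autotrain-data-vit-skin-derna` repository named above), both splits can be loaded with the `datasets` library:
```python
from datasets import load_dataset

# Load the AutoTrain-processed dataset; "train" and "valid" are the
# two splits listed in the table above.
dataset = load_dataset("dharun2049/autotrain-data-vit-skin-derna")

train = dataset["train"]  # 40000 samples
valid = dataset["valid"]  # 10000 samples

# Each example holds a decoded PIL image and an integer class label.
example = train[0]
print(example["image"].size, example["target"])
```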
|
open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST | 2023-09-21T23:19:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of wei123602/Llama-2-13b-FINETUNE4_TEST
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wei123602/Llama-2-13b-FINETUNE4_TEST](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-21T23:17:56.003321](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST/blob/main/results_2023-09-21T23-17-56.003321.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5601115387662375,\n\
\ \"acc_stderr\": 0.034549712275199894,\n \"acc_norm\": 0.5644236356299138,\n\
\ \"acc_norm_stderr\": 0.03453167763754593,\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842895,\n \"mc2\": 0.3913965666576138,\n\
\ \"mc2_stderr\": 0.014096837740998912\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.014611390804670088,\n \
\ \"acc_norm\": 0.5477815699658704,\n \"acc_norm_stderr\": 0.014544519880633827\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6085441147181836,\n\
\ \"acc_stderr\": 0.0048707850367082925,\n \"acc_norm\": 0.8151762597092213,\n\
\ \"acc_norm_stderr\": 0.0038736123391606564\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874143,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874143\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776292,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776292\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425075,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425075\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6290322580645161,\n\
\ \"acc_stderr\": 0.027480541887953593,\n \"acc_norm\": 0.6290322580645161,\n\
\ \"acc_norm_stderr\": 0.027480541887953593\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.03567969772268049,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.03567969772268049\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.03274287914026866,\n \"acc_norm\"\
: 0.696969696969697,\n \"acc_norm_stderr\": 0.03274287914026866\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.558974358974359,\n \"acc_stderr\": 0.025174048384000756,\n \
\ \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000756\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.03169380235712996,\n \
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.03169380235712996\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790236,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790236\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696042,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696042\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654362,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.04374928560599736,\n\
\ \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.04374928560599736\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.02537213967172293,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.02537213967172293\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n\
\ \"acc_stderr\": 0.0150463018466918,\n \"acc_norm\": 0.7701149425287356,\n\
\ \"acc_norm_stderr\": 0.0150463018466918\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.026189666966272035,\n\
\ \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.026189666966272035\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n\
\ \"acc_stderr\": 0.015251931579208173,\n \"acc_norm\": 0.29497206703910617,\n\
\ \"acc_norm_stderr\": 0.015251931579208173\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.02803609227389177,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.02803609227389177\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6172839506172839,\n \"acc_stderr\": 0.027044538138402616,\n\
\ \"acc_norm\": 0.6172839506172839,\n \"acc_norm_stderr\": 0.027044538138402616\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n\
\ \"acc_stderr\": 0.012665568135455326,\n \"acc_norm\": 0.4361147327249022,\n\
\ \"acc_norm_stderr\": 0.012665568135455326\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5686274509803921,\n \"acc_stderr\": 0.020036393768352635,\n \
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.020036393768352635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5346938775510204,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.5346938775510204,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.03789134424611552,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.03789134424611552\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987249,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987249\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842895,\n \"mc2\": 0.3913965666576138,\n\
\ \"mc2_stderr\": 0.014096837740998912\n }\n}\n```"
repo_url: https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|arc:challenge|25_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hellaswag|10_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T23-17-56.003321.parquet'
- config_name: results
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- results_2023-09-21T23-17-56.003321.parquet
- split: latest
path:
- results_2023-09-21T23-17-56.003321.parquet
---
# Dataset Card for Evaluation run of wei123602/Llama-2-13b-FINETUNE4_TEST
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wei123602/Llama-2-13b-FINETUNE4_TEST](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST",
"harness_truthfulqa_mc_0",
	split="latest")
```
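The aggregated scores live in the "results" configuration defined above; here is a minimal sketch of loading them the same way (the "latest" split points to the most recent run):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST",
	"results",
	split="latest")
```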
## Latest results
These are the [latest results from run 2023-09-21T23:17:56.003321](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST/blob/main/results_2023-09-21T23-17-56.003321.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5601115387662375,
"acc_stderr": 0.034549712275199894,
"acc_norm": 0.5644236356299138,
"acc_norm_stderr": 0.03453167763754593,
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842895,
"mc2": 0.3913965666576138,
"mc2_stderr": 0.014096837740998912
},
"harness|arc:challenge|25": {
"acc": 0.5,
"acc_stderr": 0.014611390804670088,
"acc_norm": 0.5477815699658704,
"acc_norm_stderr": 0.014544519880633827
},
"harness|hellaswag|10": {
"acc": 0.6085441147181836,
"acc_stderr": 0.0048707850367082925,
"acc_norm": 0.8151762597092213,
"acc_norm_stderr": 0.0038736123391606564
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.030151134457776292,
"acc_norm": 0.6,
"acc_norm_stderr": 0.030151134457776292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425075,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425075
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6290322580645161,
"acc_stderr": 0.027480541887953593,
"acc_norm": 0.6290322580645161,
"acc_norm_stderr": 0.027480541887953593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.03567969772268049,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.03567969772268049
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03274287914026866,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03274287914026866
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.558974358974359,
"acc_stderr": 0.025174048384000756,
"acc_norm": 0.558974358974359,
"acc_norm_stderr": 0.025174048384000756
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.03169380235712996,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.03169380235712996
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790236,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790236
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696042,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696042
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654362,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.04374928560599736,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.04374928560599736
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.02537213967172293,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.02537213967172293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7701149425287356,
"acc_stderr": 0.0150463018466918,
"acc_norm": 0.7701149425287356,
"acc_norm_stderr": 0.0150463018466918
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.026189666966272035,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.026189666966272035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29497206703910617,
"acc_stderr": 0.015251931579208173,
"acc_norm": 0.29497206703910617,
"acc_norm_stderr": 0.015251931579208173
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.02803609227389177,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.02803609227389177
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6172839506172839,
"acc_stderr": 0.027044538138402616,
"acc_norm": 0.6172839506172839,
"acc_norm_stderr": 0.027044538138402616
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4361147327249022,
"acc_stderr": 0.012665568135455326,
"acc_norm": 0.4361147327249022,
"acc_norm_stderr": 0.012665568135455326
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.020036393768352635,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.020036393768352635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5346938775510204,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.5346938775510204,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.03789134424611552,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.03789134424611552
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842895,
"mc2": 0.3913965666576138,
"mc2_stderr": 0.014096837740998912
}
}
```
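As a quick sanity check on the aggregates, here is a minimal sketch (assuming the JSON block above has been saved locally as `results.json`, a hypothetical filename) that recomputes the macro-average `acc_norm` over the 57 hendrycksTest (MMLU) subtasks:
```python
import json

# Load the per-task results shown above.
with open("results.json") as f:
    results = json.load(f)

# Macro-average acc_norm over the hendrycksTest (MMLU) subtasks only.
mmlu = [v["acc_norm"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```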
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dharun2049/autotrain-data-im-gonna-kms | 2023-09-21T23:21:59.000Z | [
"region:us"
] | dharun2049 | null | null | null | 0 | 0 | Entry not found |
BangumiBase/donttoywithmemissnagatoro | 2023-09-29T10:12:18.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Don't Toy With Me, Miss Nagatoro
This is the image base of bangumi Don't Toy With Me, Miss Nagatoro. We detected 19 characters and 3059 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability), for example as sketched below.
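A minimal sketch of that workflow (assuming the `huggingface_hub` client is installed; the archive path `2/dataset.zip` comes from the preview table below): download one character's image base and extract it for manual inspection before training.
```python
import zipfile

from huggingface_hub import hf_hub_download

# Fetch one character archive from this dataset repo (character #2 below).
archive = hf_hub_download(
    repo_id="BangumiBase/donttoywithmemissnagatoro",
    filename="2/dataset.zip",
    repo_type="dataset",
)

# Extract for manual review; prune any noisy samples before training.
with zipfile.ZipFile(archive) as zf:
    zf.extractall("nagatoro_char2")
```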
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 43 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 34 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 1240 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 28 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 12 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 42 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 28 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 16 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 1114 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 9 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 15 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 144 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 11 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 87 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 9 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 11 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 83 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 12 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 121 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
danielz01/adv_glue_plus_plus | 2023-09-22T00:20:14.000Z | [
"task_categories:text-classification",
"size_categories:10K<n<100K",
"language:en",
"license:mit",
"region:us"
] | danielz01 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: "advglue_plus_plus.json"
field: ["sst2", "qqp", "mnli", "mnli-mm", "qnli", "rte"]
license: mit
task_categories:
- text-classification
language:
- en
pretty_name: AdvGLUE++ Test Data
size_categories:
- 10K<n<100K
--- |
baran341134/PAPATYA | 2023-09-29T16:58:03.000Z | [
"region:us"
] | baran341134 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b | 2023-09-22T00:11:02.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of zarakiquemparte/kuchiki-1.1-l2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zarakiquemparte/kuchiki-1.1-l2-7b](https://huggingface.co/zarakiquemparte/kuchiki-1.1-l2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T00:09:37.890921](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b/blob/main/results_2023-09-22T00-09-37.890921.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48408052324556666,\n\
\ \"acc_stderr\": 0.035282395376323994,\n \"acc_norm\": 0.4874623541482438,\n\
\ \"acc_norm_stderr\": 0.035268930646030636,\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494884,\n \"mc2\": 0.499579377079004,\n\
\ \"mc2_stderr\": 0.015585935701840655\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5273037542662116,\n \"acc_stderr\": 0.014589589101985996,\n\
\ \"acc_norm\": 0.5418088737201365,\n \"acc_norm_stderr\": 0.0145602203087147\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5950009958175663,\n\
\ \"acc_stderr\": 0.0048988860806879276,\n \"acc_norm\": 0.7800238996215894,\n\
\ \"acc_norm_stderr\": 0.004133835786651186\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5094339622641509,\n \"acc_stderr\": 0.030767394707808093,\n\
\ \"acc_norm\": 0.5094339622641509,\n \"acc_norm_stderr\": 0.030767394707808093\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.023393826500484865,\n \"\
acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484865\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.039701582732351734,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.039701582732351734\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5258064516129032,\n \"acc_stderr\": 0.028406095057653326,\n \"\
acc_norm\": 0.5258064516129032,\n \"acc_norm_stderr\": 0.028406095057653326\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3793103448275862,\n \"acc_stderr\": 0.034139638059062345,\n \"\
acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.034139638059062345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.0381549430868893,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.0381549430868893\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6060606060606061,\n \"acc_stderr\": 0.03481285338232963,\n \"\
acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.03481285338232963\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7098445595854922,\n \"acc_stderr\": 0.03275264467791516,\n\
\ \"acc_norm\": 0.7098445595854922,\n \"acc_norm_stderr\": 0.03275264467791516\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.025275892070240634,\n\
\ \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.025275892070240634\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145658,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145658\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236152,\n\
\ \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6678899082568808,\n \"acc_stderr\": 0.02019268298542333,\n \"\
acc_norm\": 0.6678899082568808,\n \"acc_norm_stderr\": 0.02019268298542333\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.03214952147802751,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03214952147802751\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6421568627450981,\n \"acc_stderr\": 0.03364487286088299,\n \"\
acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.03364487286088299\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6751054852320675,\n \"acc_stderr\": 0.030486039389105296,\n \
\ \"acc_norm\": 0.6751054852320675,\n \"acc_norm_stderr\": 0.030486039389105296\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.04374928560599738,\n\
\ \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.04374928560599738\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n\
\ \"acc_stderr\": 0.029872577708891186,\n \"acc_norm\": 0.7051282051282052,\n\
\ \"acc_norm_stderr\": 0.029872577708891186\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6628352490421456,\n\
\ \"acc_stderr\": 0.01690520742080355,\n \"acc_norm\": 0.6628352490421456,\n\
\ \"acc_norm_stderr\": 0.01690520742080355\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382875,\n\
\ \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28268156424581004,\n\
\ \"acc_stderr\": 0.015060381730018096,\n \"acc_norm\": 0.28268156424581004,\n\
\ \"acc_norm_stderr\": 0.015060381730018096\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556054,\n\
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556054\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.572347266881029,\n\
\ \"acc_stderr\": 0.02809924077580956,\n \"acc_norm\": 0.572347266881029,\n\
\ \"acc_norm_stderr\": 0.02809924077580956\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5339506172839507,\n \"acc_stderr\": 0.027756535257347663,\n\
\ \"acc_norm\": 0.5339506172839507,\n \"acc_norm_stderr\": 0.027756535257347663\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.02872386385328128,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.02872386385328128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3624511082138201,\n\
\ \"acc_stderr\": 0.012277512533252486,\n \"acc_norm\": 0.3624511082138201,\n\
\ \"acc_norm_stderr\": 0.012277512533252486\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4395424836601307,\n \"acc_stderr\": 0.020079420408087918,\n \
\ \"acc_norm\": 0.4395424836601307,\n \"acc_norm_stderr\": 0.020079420408087918\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5346938775510204,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.5346938775510204,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6169154228855721,\n\
\ \"acc_stderr\": 0.0343751933733825,\n \"acc_norm\": 0.6169154228855721,\n\
\ \"acc_norm_stderr\": 0.0343751933733825\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494884,\n \"mc2\": 0.499579377079004,\n\
\ \"mc2_stderr\": 0.015585935701840655\n }\n}\n```"
repo_url: https://huggingface.co/zarakiquemparte/kuchiki-1.1-l2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|arc:challenge|25_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hellaswag|10_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-09-37.890921.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-09-37.890921.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T00-09-37.890921.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T00-09-37.890921.parquet'
- config_name: results
data_files:
- split: 2023_09_22T00_09_37.890921
path:
- results_2023-09-22T00-09-37.890921.parquet
- split: latest
path:
- results_2023-09-22T00-09-37.890921.parquet
---
# Dataset Card for Evaluation run of zarakiquemparte/kuchiki-1.1-l2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/zarakiquemparte/kuchiki-1.1-l2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [zarakiquemparte/kuchiki-1.1-l2-7b](https://huggingface.co/zarakiquemparte/kuchiki-1.1-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b",
"harness_truthfulqa_mc_0",
	split="latest")
```
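The same pattern works for the other configurations. Below is a minimal sketch, using only config and split names that appear in the listing above, that enumerates the available configs and loads the aggregated "results" for the most recent run:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b"

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs))

# The "latest" split always points to the most recent evaluation run.
results = load_dataset(repo, "results", split="latest")
```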
## Latest results
These are the [latest results from run 2023-09-22T00:09:37.890921](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__kuchiki-1.1-l2-7b/blob/main/results_2023-09-22T00-09-37.890921.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48408052324556666,
"acc_stderr": 0.035282395376323994,
"acc_norm": 0.4874623541482438,
"acc_norm_stderr": 0.035268930646030636,
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494884,
"mc2": 0.499579377079004,
"mc2_stderr": 0.015585935701840655
},
"harness|arc:challenge|25": {
"acc": 0.5273037542662116,
"acc_stderr": 0.014589589101985996,
"acc_norm": 0.5418088737201365,
"acc_norm_stderr": 0.0145602203087147
},
"harness|hellaswag|10": {
"acc": 0.5950009958175663,
"acc_stderr": 0.0048988860806879276,
"acc_norm": 0.7800238996215894,
"acc_norm_stderr": 0.004133835786651186
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5094339622641509,
"acc_stderr": 0.030767394707808093,
"acc_norm": 0.5094339622641509,
"acc_norm_stderr": 0.030767394707808093
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.023393826500484865,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.023393826500484865
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.039701582732351734,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.039701582732351734
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5258064516129032,
"acc_stderr": 0.028406095057653326,
"acc_norm": 0.5258064516129032,
"acc_norm_stderr": 0.028406095057653326
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.0381549430868893,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.0381549430868893
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.03481285338232963,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.03481285338232963
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7098445595854922,
"acc_stderr": 0.03275264467791516,
"acc_norm": 0.7098445595854922,
"acc_norm_stderr": 0.03275264467791516
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.025275892070240634,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.025275892070240634
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145658,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145658
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6678899082568808,
"acc_stderr": 0.02019268298542333,
"acc_norm": 0.6678899082568808,
"acc_norm_stderr": 0.02019268298542333
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03214952147802751,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03214952147802751
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.03364487286088299,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.03364487286088299
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6751054852320675,
"acc_stderr": 0.030486039389105296,
"acc_norm": 0.6751054852320675,
"acc_norm_stderr": 0.030486039389105296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.029872577708891186,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.029872577708891186
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6628352490421456,
"acc_stderr": 0.01690520742080355,
"acc_norm": 0.6628352490421456,
"acc_norm_stderr": 0.01690520742080355
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.026897049996382875,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.026897049996382875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28268156424581004,
"acc_stderr": 0.015060381730018096,
"acc_norm": 0.28268156424581004,
"acc_norm_stderr": 0.015060381730018096
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.028627470550556054,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.028627470550556054
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.572347266881029,
"acc_stderr": 0.02809924077580956,
"acc_norm": 0.572347266881029,
"acc_norm_stderr": 0.02809924077580956
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5339506172839507,
"acc_stderr": 0.027756535257347663,
"acc_norm": 0.5339506172839507,
"acc_norm_stderr": 0.027756535257347663
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.02872386385328128,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.02872386385328128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3624511082138201,
"acc_stderr": 0.012277512533252486,
"acc_norm": 0.3624511082138201,
"acc_norm_stderr": 0.012277512533252486
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4395424836601307,
"acc_stderr": 0.020079420408087918,
"acc_norm": 0.4395424836601307,
"acc_norm_stderr": 0.020079420408087918
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5346938775510204,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.5346938775510204,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6169154228855721,
"acc_stderr": 0.0343751933733825,
"acc_norm": 0.6169154228855721,
"acc_norm_stderr": 0.0343751933733825
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494884,
"mc2": 0.499579377079004,
"mc2_stderr": 0.015585935701840655
}
}
```
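For orientation, the top-level "all" block aggregates the per-task metrics. The sketch below, assuming the results block above has been saved locally as `results.json` (a hypothetical filename), recomputes the mean accuracy over the individual tasks; if the aggregate is an unweighted mean, the two values should agree:
```python
import json

# Hypothetical local copy of the results shown above.
with open("results.json") as f:
    results = json.load(f)

# Every task except "all" reports "acc"; truthfulqa reports mc1/mc2 instead.
accs = [v["acc"] for name, v in results.items()
        if name != "all" and "acc" in v]

print(f"mean acc over {len(accs)} tasks: {sum(accs) / len(accs):.4f}")
print(f"reported 'all' acc: {results['all']['acc']:.4f}")
```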
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
abdoelsayed/AMuRD | 2023-09-22T00:19:55.000Z | [
"arxiv:2309.09800",
"region:us"
] | abdoelsayed | null | null | null | 0 | 0 | # <a href="https://arxiv.org/abs/2309.09800">AMuRD</a>: Annotated Multilingual Receipts Dataset for Cross-lingual Key Information Extraction and Classification
by
Abdelrahman Abdallah,
Mahmoud Abdalla,
Mohamed Elkasaby,
Yasser Elbendary,
Adam Jatowt

## Abstract
> Key information extraction involves recognizing and extracting text from scanned receipts,
enabling retrieval of essential content, and organizing it into structured documents.
This paper presents a novel multilingual dataset for receipt extraction, addressing key challenges in information extraction and item classification.
The dataset comprises 47,720 samples, including annotations for item names, attributes (price, brand, etc.), and classification into 44 product categories.
We introduce the InstructLLaMA approach, achieving an F1 score of 0.76 and an accuracy of 0.68 for key information extraction and item classification.
## Demo for our InstructLLaMA
Explore our InstructLLaMA system through our live demo:
[**Demo for our InstructLLaMA**](http://18.188.209.98:5052/)
## Examples
| Example | Input | Class | Brand | Weight | Number of units | Size of units | Price | T.Price | Pack | Unit |
| ------- | ------------------------------------ | ---------------------- | -------------| --------- | ---------------- | --------------- | ------- | ------- | ------ | ----- |
| Example 1| `40.99 20.99 2 chunks sunshine` | Tins, Jars & Packets | sunshine | No Weight | 2 | No Size of units| 20.99 | 40.99 | علبة | No Unit |
| Example 2| `برسيل اتوماتيك جل روز 2.6` | Cleaning Supplies | برسيل | 2.6ل | 1 | No Size of units| No Price| No T.Price | عبوة | ل |
| Example 3| `regina Pasta penne 400g` | Rice, Pasta & Pulses | regina | 400g | 1 | No Size of units| No Price| No T.Price | كيس | g |
| Example 4| `10.00 400g Penne Pasta ElMaleka` | Rice, Pasta & Pulses | ElMaleka | 400g | 1 | No Size of units| 10 | 10 | كيس | g |
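To make the annotation schema concrete, a single row of the table can be read as a key-value record. The sketch below mirrors the table columns for Example 4; the field names are illustrative and not necessarily the dataset's on-disk serialization:
```python
# Illustrative record for Example 4; keys mirror the table columns above,
# not necessarily the dataset's actual field names.
record = {
    "input": "10.00 400g Penne Pasta ElMaleka",
    "class": "Rice, Pasta & Pulses",
    "brand": "ElMaleka",
    "weight": "400g",
    "number_of_units": 1,
    "size_of_units": None,  # "No Size of units" in the table
    "price": 10.00,
    "total_price": 10.00,   # the "T.Price" column
    "pack": "كيس",          # pack type, given in Arabic ("bag")
    "unit": "g",
}
```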
## Getting the code
To get started with the code and utilize the AMuRD dataset for your research or projects, you can clone this repository:
```bash
git clone https://github.com/yourusername/AMuRD.git
```
## Dependencies
## Reproducing the results
## Citation
Please consider citing our paper:
```
@misc{abdallah2023amurd,
title={AMuRD: Annotated Multilingual Receipts Dataset for Cross-lingual Key Information Extraction and Classification},
author={Abdelrahman Abdallah and Mahmoud Abdalla and Mohamed Elkasaby and Yasser Elbendary and Adam Jatowt},
year={2023},
eprint={2309.09800},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## License
Note: The AMuRD dataset can only be used for non-commercial research purposes.
Researchers who want to use the AMuRD dataset should first fill
in this [Application Form](Application_Form/Application_Form_for_AMuRD.doc)
and send it via email to us ([m.abdallah@discoapp.ai](mailto:m.abdallah@discoapp.ai), [Yelbendary@discoapp.ai](mailto:Yelbendary@discoapp.ai), [abdoelsayed2016@gmail.com](mailto:abdoelsayed2016@gmail.com)).
|
open-llm-leaderboard/details_zarakiquemparte__kuchiki-l2-7b | 2023-09-22T00:22:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of zarakiquemparte/kuchiki-l2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zarakiquemparte/kuchiki-l2-7b](https://huggingface.co/zarakiquemparte/kuchiki-l2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__kuchiki-l2-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T00:21:14.015290](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__kuchiki-l2-7b/blob/main/results_2023-09-22T00-21-14.015290.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48033670243463156,\n\
\ \"acc_stderr\": 0.03520401055785556,\n \"acc_norm\": 0.4837561263577122,\n\
\ \"acc_norm_stderr\": 0.03519011002394659,\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.49876664192397546,\n\
\ \"mc2_stderr\": 0.015614995245906657\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5255972696245734,\n \"acc_stderr\": 0.014592230885298962,\n\
\ \"acc_norm\": 0.5435153583617748,\n \"acc_norm_stderr\": 0.01455594976049644\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6005775741884087,\n\
\ \"acc_stderr\": 0.004887787255353492,\n \"acc_norm\": 0.7844054969129656,\n\
\ \"acc_norm_stderr\": 0.004103936879526263\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3881578947368421,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.3881578947368421,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5056603773584906,\n \"acc_stderr\": 0.03077090076385131,\n\
\ \"acc_norm\": 0.5056603773584906,\n \"acc_norm_stderr\": 0.03077090076385131\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237657,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237657\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.02351729433596328,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02351729433596328\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5096774193548387,\n\
\ \"acc_stderr\": 0.028438677998909558,\n \"acc_norm\": 0.5096774193548387,\n\
\ \"acc_norm_stderr\": 0.028438677998909558\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.033764582465095665,\n\
\ \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.033764582465095665\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165634,\n\
\ \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165634\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.03292296639155141,\n\
\ \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.03292296639155141\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.025275892070240637,\n\
\ \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.025275892070240637\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230175,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230175\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478466,\n\
\ \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478466\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6678899082568808,\n \"acc_stderr\": 0.020192682985423337,\n \"\
acc_norm\": 0.6678899082568808,\n \"acc_norm_stderr\": 0.020192682985423337\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3055555555555556,\n \"acc_stderr\": 0.03141554629402545,\n \"\
acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03141554629402545\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6421568627450981,\n \"acc_stderr\": 0.03364487286088299,\n \"\
acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.03364487286088299\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6540084388185654,\n \"acc_stderr\": 0.03096481058878671,\n \
\ \"acc_norm\": 0.6540084388185654,\n \"acc_norm_stderr\": 0.03096481058878671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n\
\ \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n\
\ \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n\
\ \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n\
\ \"acc_stderr\": 0.02987257770889119,\n \"acc_norm\": 0.7051282051282052,\n\
\ \"acc_norm_stderr\": 0.02987257770889119\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6602809706257982,\n\
\ \"acc_stderr\": 0.01693639411430165,\n \"acc_norm\": 0.6602809706257982,\n\
\ \"acc_norm_stderr\": 0.01693639411430165\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.02690290045866664,\n\
\ \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.02690290045866664\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3005586592178771,\n\
\ \"acc_stderr\": 0.015334566806251164,\n \"acc_norm\": 0.3005586592178771,\n\
\ \"acc_norm_stderr\": 0.015334566806251164\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n\
\ \"acc_stderr\": 0.028043399858210624,\n \"acc_norm\": 0.5787781350482315,\n\
\ \"acc_norm_stderr\": 0.028043399858210624\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777797,\n\
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777797\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.0286638201471995,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.0286638201471995\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3546284224250326,\n\
\ \"acc_stderr\": 0.012218576439090169,\n \"acc_norm\": 0.3546284224250326,\n\
\ \"acc_norm_stderr\": 0.012218576439090169\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.44607843137254904,\n \"acc_stderr\": 0.020109864547181357,\n \
\ \"acc_norm\": 0.44607843137254904,\n \"acc_norm_stderr\": 0.020109864547181357\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5061224489795918,\n \"acc_stderr\": 0.03200682020163908,\n\
\ \"acc_norm\": 0.5061224489795918,\n \"acc_norm_stderr\": 0.03200682020163908\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.034104105654953004,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.034104105654953004\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.49876664192397546,\n\
\ \"mc2_stderr\": 0.015614995245906657\n }\n}\n```"
repo_url: https://huggingface.co/zarakiquemparte/kuchiki-l2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|arc:challenge|25_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hellaswag|10_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-21-14.015290.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-21-14.015290.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T00-21-14.015290.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T00-21-14.015290.parquet'
- config_name: results
data_files:
- split: 2023_09_22T00_21_14.015290
path:
- results_2023-09-22T00-21-14.015290.parquet
- split: latest
path:
- results_2023-09-22T00-21-14.015290.parquet
---
# Dataset Card for Evaluation run of zarakiquemparte/kuchiki-l2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/zarakiquemparte/kuchiki-l2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [zarakiquemparte/kuchiki-l2-7b](https://huggingface.co/zarakiquemparte/kuchiki-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__kuchiki-l2-7b",
"harness_truthfulqa_mc_0",
split="train")
```
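The same pattern works for the aggregated scores. A minimal sketch, using the `results` configuration and the `latest` split listed in this card's configs:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics of the run; the "latest" split
# always points at the most recent evaluation timestamp.
results = load_dataset("open-llm-leaderboard/details_zarakiquemparte__kuchiki-l2-7b",
	"results",
	split="latest")
```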
## Latest results
These are the [latest results from run 2023-09-22T00:21:14.015290](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__kuchiki-l2-7b/blob/main/results_2023-09-22T00-21-14.015290.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48033670243463156,
"acc_stderr": 0.03520401055785556,
"acc_norm": 0.4837561263577122,
"acc_norm_stderr": 0.03519011002394659,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.49876664192397546,
"mc2_stderr": 0.015614995245906657
},
"harness|arc:challenge|25": {
"acc": 0.5255972696245734,
"acc_stderr": 0.014592230885298962,
"acc_norm": 0.5435153583617748,
"acc_norm_stderr": 0.01455594976049644
},
"harness|hellaswag|10": {
"acc": 0.6005775741884087,
"acc_stderr": 0.004887787255353492,
"acc_norm": 0.7844054969129656,
"acc_norm_stderr": 0.004103936879526263
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3881578947368421,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.3881578947368421,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5056603773584906,
"acc_stderr": 0.03077090076385131,
"acc_norm": 0.5056603773584906,
"acc_norm_stderr": 0.03077090076385131
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237657,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237657
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02351729433596328,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02351729433596328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5096774193548387,
"acc_stderr": 0.028438677998909558,
"acc_norm": 0.5096774193548387,
"acc_norm_stderr": 0.028438677998909558
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.033764582465095665,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.033764582465095665
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.03793713171165634,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.03793713171165634
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.03292296639155141,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.03292296639155141
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.025275892070240637,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.025275892070240637
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230175,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230175
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478466,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478466
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6678899082568808,
"acc_stderr": 0.020192682985423337,
"acc_norm": 0.6678899082568808,
"acc_norm_stderr": 0.020192682985423337
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03141554629402545,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03141554629402545
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.03364487286088299,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.03364487286088299
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6540084388185654,
"acc_stderr": 0.03096481058878671,
"acc_norm": 0.6540084388185654,
"acc_norm_stderr": 0.03096481058878671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5829596412556054,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.5829596412556054,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.02987257770889119,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.02987257770889119
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6602809706257982,
"acc_stderr": 0.01693639411430165,
"acc_norm": 0.6602809706257982,
"acc_norm_stderr": 0.01693639411430165
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.02690290045866664,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.02690290045866664
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3005586592178771,
"acc_stderr": 0.015334566806251164,
"acc_norm": 0.3005586592178771,
"acc_norm_stderr": 0.015334566806251164
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.028043399858210624,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.028043399858210624
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.027777777777777797,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.027777777777777797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.0286638201471995,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.0286638201471995
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3546284224250326,
"acc_stderr": 0.012218576439090169,
"acc_norm": 0.3546284224250326,
"acc_norm_stderr": 0.012218576439090169
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.44607843137254904,
"acc_stderr": 0.020109864547181357,
"acc_norm": 0.44607843137254904,
"acc_norm_stderr": 0.020109864547181357
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5061224489795918,
"acc_stderr": 0.03200682020163908,
"acc_norm": 0.5061224489795918,
"acc_norm_stderr": 0.03200682020163908
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.034104105654953004,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.034104105654953004
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.49876664192397546,
"mc2_stderr": 0.015614995245906657
}
}
```
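To read these numbers programmatically rather than copying them from the card, one option is to download the results file directly. This is a minimal sketch, assuming the stored JSON mirrors the dictionary printed above (its exact top-level layout is an assumption here):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file from this dataset repository
# (the filename comes from the link above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_zarakiquemparte__kuchiki-l2-7b",
    filename="results_2023-09-22T00-21-14.015290.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Assumption: the file mirrors the dict shown above; if the repo wraps it
# in extra metadata, the per-task scores may live under a "results" key.
results = data.get("results", data)
print(results["all"]["acc"])  # e.g. 0.48033670243463156
```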
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.2-l2-7b | 2023-09-22T00:25:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of zarakiquemparte/zarafusionex-1.2-l2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zarakiquemparte/zarafusionex-1.2-l2-7b](https://huggingface.co/zarakiquemparte/zarafusionex-1.2-l2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.2-l2-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T00:24:36.284847](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.2-l2-7b/blob/main/results_2023-09-22T00-24-36.284847.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5209455058979542,\n\
\ \"acc_stderr\": 0.03496096921096046,\n \"acc_norm\": 0.5247651002073963,\n\
\ \"acc_norm_stderr\": 0.034945136870235136,\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.0167113581635444,\n \"mc2\": 0.5129341494547134,\n\
\ \"mc2_stderr\": 0.015326308140507998\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5349829351535836,\n \"acc_stderr\": 0.01457558392201967,\n\
\ \"acc_norm\": 0.5665529010238908,\n \"acc_norm_stderr\": 0.014481376224558902\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5977892850029874,\n\
\ \"acc_stderr\": 0.0048934189299182735,\n \"acc_norm\": 0.791575383389763,\n\
\ \"acc_norm_stderr\": 0.004053518524584593\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.040463368839782514,\n\
\ \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.040463368839782514\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.030365050829115208,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.030365050829115208\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179326,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179326\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467383,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467383\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101803,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101803\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n\
\ \"acc_stderr\": 0.028129112709165904,\n \"acc_norm\": 0.5741935483870968,\n\
\ \"acc_norm_stderr\": 0.028129112709165904\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n\
\ \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03358618145732522,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03358618145732522\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.03074890536390989,\n\
\ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.03074890536390989\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5025641025641026,\n \"acc_stderr\": 0.025350672979412195,\n\
\ \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412195\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844086,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844086\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.49159663865546216,\n \"acc_stderr\": 0.032473902765696686,\n\
\ \"acc_norm\": 0.49159663865546216,\n \"acc_norm_stderr\": 0.032473902765696686\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7119266055045872,\n \"acc_stderr\": 0.019416445892636032,\n \"\
acc_norm\": 0.7119266055045872,\n \"acc_norm_stderr\": 0.019416445892636032\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.033851779760448106,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.033851779760448106\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n \"acc_norm\"\
: 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395592,\n \"\
acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395592\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.0426073515764456,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.0426073515764456\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.0432076780753667,\n \"acc_norm\"\
: 0.6611570247933884,\n \"acc_norm_stderr\": 0.0432076780753667\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.038956324641389366,\n\
\ \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.038956324641389366\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935434,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935434\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7100893997445722,\n\
\ \"acc_stderr\": 0.016225017944770975,\n \"acc_norm\": 0.7100893997445722,\n\
\ \"acc_norm_stderr\": 0.016225017944770975\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.026756255129663762,\n\
\ \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.026756255129663762\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n\
\ \"acc_stderr\": 0.014912413096372434,\n \"acc_norm\": 0.2737430167597765,\n\
\ \"acc_norm_stderr\": 0.014912413096372434\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.02845263998508801,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.02845263998508801\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930477,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930477\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.0277012284685426,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.0277012284685426\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3767926988265971,\n\
\ \"acc_stderr\": 0.012376459593894402,\n \"acc_norm\": 0.3767926988265971,\n\
\ \"acc_norm_stderr\": 0.012376459593894402\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.030290619180485694,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.030290619180485694\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02021703065318646,\n \
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02021703065318646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.03280188205348642,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.03280188205348642\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.0167113581635444,\n \"mc2\": 0.5129341494547134,\n\
\ \"mc2_stderr\": 0.015326308140507998\n }\n}\n```"
repo_url: https://huggingface.co/zarakiquemparte/zarafusionex-1.2-l2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|arc:challenge|25_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hellaswag|10_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-24-36.284847.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T00-24-36.284847.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T00-24-36.284847.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T00-24-36.284847.parquet'
- config_name: results
data_files:
- split: 2023_09_22T00_24_36.284847
path:
- results_2023-09-22T00-24-36.284847.parquet
- split: latest
path:
- results_2023-09-22T00-24-36.284847.parquet
---
# Dataset Card for Evaluation run of zarakiquemparte/zarafusionex-1.2-l2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/zarakiquemparte/zarafusionex-1.2-l2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [zarakiquemparte/zarafusionex-1.2-l2-7b](https://huggingface.co/zarakiquemparte/zarafusionex-1.2-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.2-l2-7b",
"harness_truthfulqa_mc_0",
split="train")
```
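You can also enumerate the available configurations, or load the aggregated "results" configuration directly. A minimal sketch, assuming the `datasets` library is installed (the config and split names below are the ones listed in this card):
```python
from datasets import get_dataset_config_names, load_dataset

# Enumerate the 61 configurations (one per evaluated task, plus "results").
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.2-l2-7b"
)
print(len(configs), configs[:3])

# Load the aggregated metrics. "latest" always points to the most recent run;
# the timestamped split (here 2023_09_22T00_24_36.284847) pins a specific run.
results = load_dataset(
    "open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.2-l2-7b",
    "results",
    split="latest",
)
```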
## Latest results
These are the [latest results from run 2023-09-22T00:24:36.284847](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.2-l2-7b/blob/main/results_2023-09-22T00-24-36.284847.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5209455058979542,
"acc_stderr": 0.03496096921096046,
"acc_norm": 0.5247651002073963,
"acc_norm_stderr": 0.034945136870235136,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.0167113581635444,
"mc2": 0.5129341494547134,
"mc2_stderr": 0.015326308140507998
},
"harness|arc:challenge|25": {
"acc": 0.5349829351535836,
"acc_stderr": 0.01457558392201967,
"acc_norm": 0.5665529010238908,
"acc_norm_stderr": 0.014481376224558902
},
"harness|hellaswag|10": {
"acc": 0.5977892850029874,
"acc_stderr": 0.0048934189299182735,
"acc_norm": 0.791575383389763,
"acc_norm_stderr": 0.004053518524584593
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.040463368839782514,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.040463368839782514
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.030365050829115208,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.030365050829115208
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179326,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179326
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467383,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467383
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101803,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101803
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.040406101782088394,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.040406101782088394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165904,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165904
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03358618145732522,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03358618145732522
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.03074890536390989,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.03074890536390989
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5025641025641026,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.5025641025641026,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844086,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.49159663865546216,
"acc_stderr": 0.032473902765696686,
"acc_norm": 0.49159663865546216,
"acc_norm_stderr": 0.032473902765696686
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7119266055045872,
"acc_stderr": 0.019416445892636032,
"acc_norm": 0.7119266055045872,
"acc_norm_stderr": 0.019416445892636032
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.03228210387037892,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.03228210387037892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395592,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395592
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.0426073515764456,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.0426073515764456
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.0432076780753667,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.0432076780753667
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.038956324641389366,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.038956324641389366
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935434,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935434
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7100893997445722,
"acc_stderr": 0.016225017944770975,
"acc_norm": 0.7100893997445722,
"acc_norm_stderr": 0.016225017944770975
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.026756255129663762,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.026756255129663762
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372434,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372434
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.02845263998508801,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.02845263998508801
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930477,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930477
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.0277012284685426,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.0277012284685426
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3767926988265971,
"acc_stderr": 0.012376459593894402,
"acc_norm": 0.3767926988265971,
"acc_norm_stderr": 0.012376459593894402
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.030290619180485694,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.030290619180485694
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02021703065318646,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02021703065318646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.03280188205348642,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.03280188205348642
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.0167113581635444,
"mc2": 0.5129341494547134,
"mc2_stderr": 0.015326308140507998
}
}
```
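The same figures can be read straight from the JSON file linked above. A minimal sketch, assuming the `huggingface_hub` client is installed (the exact nesting inside the file may differ, hence the defensive lookup):
```python
import json

from huggingface_hub import hf_hub_download

# Download the per-run results file referenced above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_zarakiquemparte__zarafusionex-1.2-l2-7b",
    filename="results_2023-09-22T00-24-36.284847.json",
    repo_type="dataset",
)

with open(path) as f:
    raw = json.load(f)

# Depending on the file layout, the metrics may sit at the top level or under
# a "results" key; fall back to the top level if the key is absent.
metrics = raw.get("results", raw)
print(metrics["all"])
```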
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ricardosantoss/data_ste | 2023-09-22T00:35:42.000Z | [
"region:us"
] | ricardosantoss | null | null | null | 0 | 0 | Entry not found |
viniigarena1/data1 | 2023-09-22T01:00:09.000Z | [
"license:other",
"region:us"
] | viniigarena1 | null | null | null | 0 | 0 | ---
license: other
---
|
riquinho21/ep1ep2 | 2023-09-22T01:09:31.000Z | [
"license:other",
"region:us"
] | riquinho21 | null | null | null | 0 | 0 | ---
license: other
---
|
dongyoung4091/shp-generated_flan_t5_large_flan_t5_base_zeroshot | 2023-09-23T00:41:46.000Z | [
"region:us"
] | dongyoung4091 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: zeroshot_helpfulness
dtype: float64
- name: zeroshot_specificity
dtype: float64
- name: zeroshot_intent
dtype: float64
- name: zeroshot_factuality
dtype: float64
- name: zeroshot_easy-to-understand
dtype: float64
- name: zeroshot_relevance
dtype: float64
- name: zeroshot_readability
dtype: float64
- name: zeroshot_enough-detail
dtype: float64
- name: 'zeroshot_biased:'
dtype: float64
- name: zeroshot_fail-to-consider-individual-preferences
dtype: float64
- name: zeroshot_repetetive
dtype: float64
- name: zeroshot_fail-to-consider-context
dtype: float64
- name: zeroshot_too-long
dtype: float64
splits:
- name: train
num_bytes: 29493865
num_examples: 25600
download_size: 0
dataset_size: 29493865
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "shp-generated_flan_t5_large_flan_t5_base_zeroshot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1 | 2023-09-22T01:36:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Xwin-LM/Xwin-LM-7B-V0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Xwin-LM/Xwin-LM-7B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-7B-V0.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T01:35:00.215271](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1/blob/main/results_2023-09-22T01-35-00.215271.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5020836702122426,\n\
\ \"acc_stderr\": 0.03513199545097331,\n \"acc_norm\": 0.5058805566322531,\n\
\ \"acc_norm_stderr\": 0.03511606626318198,\n \"mc1\": 0.3292533659730722,\n\
\ \"mc1_stderr\": 0.01645126444006824,\n \"mc2\": 0.47887897889684966,\n\
\ \"mc2_stderr\": 0.015477268229074508\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5273037542662116,\n \"acc_stderr\": 0.014589589101985994,\n\
\ \"acc_norm\": 0.5656996587030717,\n \"acc_norm_stderr\": 0.014484703048857359\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6083449512049393,\n\
\ \"acc_stderr\": 0.0048712266293464,\n \"acc_norm\": 0.7939653455486955,\n\
\ \"acc_norm_stderr\": 0.0040362906027860595\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.040260970832965585,\n\
\ \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.040260970832965585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283648,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283648\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.03252909619613197,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.03252909619613197\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.024278568024307712,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.024278568024307712\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5516129032258065,\n \"acc_stderr\": 0.02829205683011273,\n \"\
acc_norm\": 0.5516129032258065,\n \"acc_norm_stderr\": 0.02829205683011273\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n \"\
acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6161616161616161,\n \"acc_stderr\": 0.03464881675016339,\n \"\
acc_norm\": 0.6161616161616161,\n \"acc_norm_stderr\": 0.03464881675016339\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041153,\n\
\ \"acc_norm\": 0.7253886010362695,\n \"acc_norm_stderr\": 0.03221024508041153\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46923076923076923,\n \"acc_stderr\": 0.02530295889085015,\n\
\ \"acc_norm\": 0.46923076923076923,\n \"acc_norm_stderr\": 0.02530295889085015\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945287,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945287\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4579831932773109,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.4579831932773109,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969655,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969655\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6880733944954128,\n\
\ \"acc_stderr\": 0.019862967976707245,\n \"acc_norm\": 0.6880733944954128,\n\
\ \"acc_norm_stderr\": 0.019862967976707245\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3611111111111111,\n \"acc_stderr\": 0.032757734861009996,\n\
\ \"acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.032757734861009996\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6470588235294118,\n \"acc_stderr\": 0.03354092437591519,\n \"\
acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.03354092437591519\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6962025316455697,\n \"acc_stderr\": 0.029936696387138598,\n \
\ \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.029936696387138598\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.04320767807536671,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.04320767807536671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.038818912133343826,\n\
\ \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.038818912133343826\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.04721188506097173,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.04721188506097173\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n\
\ \"acc_stderr\": 0.029996951858349472,\n \"acc_norm\": 0.7008547008547008,\n\
\ \"acc_norm_stderr\": 0.029996951858349472\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6781609195402298,\n\
\ \"acc_stderr\": 0.0167063814150579,\n \"acc_norm\": 0.6781609195402298,\n\
\ \"acc_norm_stderr\": 0.0167063814150579\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.546242774566474,\n \"acc_stderr\": 0.02680372058320618,\n\
\ \"acc_norm\": 0.546242774566474,\n \"acc_norm_stderr\": 0.02680372058320618\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n\
\ \"acc_stderr\": 0.014987325439963546,\n \"acc_norm\": 0.2782122905027933,\n\
\ \"acc_norm_stderr\": 0.014987325439963546\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.02852638345214264,\n\
\ \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.02852638345214264\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.572347266881029,\n\
\ \"acc_stderr\": 0.028099240775809563,\n \"acc_norm\": 0.572347266881029,\n\
\ \"acc_norm_stderr\": 0.028099240775809563\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.027701228468542602,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.027701228468542602\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.379400260756193,\n\
\ \"acc_stderr\": 0.0123932020298254,\n \"acc_norm\": 0.379400260756193,\n\
\ \"acc_norm_stderr\": 0.0123932020298254\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.03035230339535196,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.03035230339535196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4722222222222222,\n \"acc_stderr\": 0.020196594933541194,\n \
\ \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.020196594933541194\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.03125127591089165,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.03125127591089165\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935558,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3292533659730722,\n\
\ \"mc1_stderr\": 0.01645126444006824,\n \"mc2\": 0.47887897889684966,\n\
\ \"mc2_stderr\": 0.015477268229074508\n }\n}\n```"
repo_url: https://huggingface.co/Xwin-LM/Xwin-LM-7B-V0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|arc:challenge|25_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hellaswag|10_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T01-35-00.215271.parquet'
- config_name: results
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- results_2023-09-22T01-35-00.215271.parquet
- split: latest
path:
- results_2023-09-22T01-35-00.215271.parquet
---
# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-7B-V0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Xwin-LM/Xwin-LM-7B-V0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Xwin-LM/Xwin-LM-7B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-7B-V0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1",
"harness_truthfulqa_mc_0",
split="train")
```
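Likewise, you can pull the aggregated metrics from the "results" configuration (a sketch based on the config names declared in this card's YAML metadata; the "latest" split tracks the most recent run):
```python
from datasets import load_dataset
# "results" stores the aggregated metrics; "latest" points at the newest run.
results = load_dataset("open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1",
	"results",
	split="latest")
```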
## Latest results
These are the [latest results from run 2023-09-22T01:35:00.215271](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1/blob/main/results_2023-09-22T01-35-00.215271.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5020836702122426,
"acc_stderr": 0.03513199545097331,
"acc_norm": 0.5058805566322531,
"acc_norm_stderr": 0.03511606626318198,
"mc1": 0.3292533659730722,
"mc1_stderr": 0.01645126444006824,
"mc2": 0.47887897889684966,
"mc2_stderr": 0.015477268229074508
},
"harness|arc:challenge|25": {
"acc": 0.5273037542662116,
"acc_stderr": 0.014589589101985994,
"acc_norm": 0.5656996587030717,
"acc_norm_stderr": 0.014484703048857359
},
"harness|hellaswag|10": {
"acc": 0.6083449512049393,
"acc_stderr": 0.0048712266293464,
"acc_norm": 0.7939653455486955,
"acc_norm_stderr": 0.0040362906027860595
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.040260970832965585,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.040260970832965585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283648,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283648
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.024278568024307712,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.024278568024307712
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5516129032258065,
"acc_stderr": 0.02829205683011273,
"acc_norm": 0.5516129032258065,
"acc_norm_stderr": 0.02829205683011273
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.03464881675016339,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.03464881675016339
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7253886010362695,
"acc_stderr": 0.03221024508041153,
"acc_norm": 0.7253886010362695,
"acc_norm_stderr": 0.03221024508041153
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46923076923076923,
"acc_stderr": 0.02530295889085015,
"acc_norm": 0.46923076923076923,
"acc_norm_stderr": 0.02530295889085015
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945287,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945287
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4579831932773109,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.4579831932773109,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969655,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969655
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6880733944954128,
"acc_stderr": 0.019862967976707245,
"acc_norm": 0.6880733944954128,
"acc_norm_stderr": 0.019862967976707245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.032757734861009996,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.032757734861009996
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.03354092437591519,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.03354092437591519
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.029936696387138598,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.029936696387138598
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.04320767807536671,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.04320767807536671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5766871165644172,
"acc_stderr": 0.038818912133343826,
"acc_norm": 0.5766871165644172,
"acc_norm_stderr": 0.038818912133343826
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.04721188506097173,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.04721188506097173
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7008547008547008,
"acc_stderr": 0.029996951858349472,
"acc_norm": 0.7008547008547008,
"acc_norm_stderr": 0.029996951858349472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6781609195402298,
"acc_stderr": 0.0167063814150579,
"acc_norm": 0.6781609195402298,
"acc_norm_stderr": 0.0167063814150579
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.546242774566474,
"acc_stderr": 0.02680372058320618,
"acc_norm": 0.546242774566474,
"acc_norm_stderr": 0.02680372058320618
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2782122905027933,
"acc_stderr": 0.014987325439963546,
"acc_norm": 0.2782122905027933,
"acc_norm_stderr": 0.014987325439963546
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.02852638345214264,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.02852638345214264
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.572347266881029,
"acc_stderr": 0.028099240775809563,
"acc_norm": 0.572347266881029,
"acc_norm_stderr": 0.028099240775809563
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.027701228468542602,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.027701228468542602
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.379400260756193,
"acc_stderr": 0.0123932020298254,
"acc_norm": 0.379400260756193,
"acc_norm_stderr": 0.0123932020298254
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.03035230339535196,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.03035230339535196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.020196594933541194,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.020196594933541194
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.03125127591089165,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.03125127591089165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935558,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935558
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3292533659730722,
"mc1_stderr": 0.01645126444006824,
"mc2": 0.47887897889684966,
"mc2_stderr": 0.015477268229074508
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dongyoung4091/shp_with_features_20k_flan_t5_large_flan_t5_large_zeroshot | 2023-09-22T02:12:52.000Z | [
"region:us"
] | dongyoung4091 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: post_id
dtype: string
- name: domain
dtype: string
- name: upvote_ratio
dtype: float64
- name: history
dtype: string
- name: c_root_id_A
dtype: string
- name: c_root_id_B
dtype: string
- name: created_at_utc_A
dtype: int64
- name: created_at_utc_B
dtype: int64
- name: score_A
dtype: int64
- name: score_B
dtype: int64
- name: human_ref_A
dtype: string
- name: human_ref_B
dtype: string
- name: labels
dtype: int64
- name: seconds_difference
dtype: float64
- name: score_ratio
dtype: float64
- name: helpfulness_A
dtype: float64
- name: helpfulness_B
dtype: float64
- name: specificity_A
dtype: float64
- name: specificity_B
dtype: float64
- name: intent_A
dtype: float64
- name: intent_B
dtype: float64
- name: factuality_A
dtype: float64
- name: factuality_B
dtype: float64
- name: easy-to-understand_A
dtype: float64
- name: easy-to-understand_B
dtype: float64
- name: relevance_A
dtype: float64
- name: relevance_B
dtype: float64
- name: readability_A
dtype: float64
- name: readability_B
dtype: float64
- name: enough-detail_A
dtype: float64
- name: enough-detail_B
dtype: float64
- name: biased:_A
dtype: float64
- name: biased:_B
dtype: float64
- name: fail-to-consider-individual-preferences_A
dtype: float64
- name: fail-to-consider-individual-preferences_B
dtype: float64
- name: repetetive_A
dtype: float64
- name: repetetive_B
dtype: float64
- name: fail-to-consider-context_A
dtype: float64
- name: fail-to-consider-context_B
dtype: float64
- name: too-long_A
dtype: float64
- name: too-long_B
dtype: float64
- name: __index_level_0__
dtype: int64
- name: log_score_A
dtype: float64
- name: log_score_B
dtype: float64
- name: zeroshot_helpfulness_A
dtype: float64
- name: zeroshot_helpfulness_B
dtype: float64
- name: zeroshot_specificity_A
dtype: float64
- name: zeroshot_specificity_B
dtype: float64
- name: zeroshot_intent_A
dtype: float64
- name: zeroshot_intent_B
dtype: float64
- name: zeroshot_factuality_A
dtype: float64
- name: zeroshot_factuality_B
dtype: float64
- name: zeroshot_easy-to-understand_A
dtype: float64
- name: zeroshot_easy-to-understand_B
dtype: float64
- name: zeroshot_relevance_A
dtype: float64
- name: zeroshot_relevance_B
dtype: float64
- name: zeroshot_readability_A
dtype: float64
- name: zeroshot_readability_B
dtype: float64
- name: zeroshot_enough-detail_A
dtype: float64
- name: zeroshot_enough-detail_B
dtype: float64
- name: zeroshot_biased:_A
dtype: float64
- name: zeroshot_biased:_B
dtype: float64
- name: zeroshot_fail-to-consider-individual-preferences_A
dtype: float64
- name: zeroshot_fail-to-consider-individual-preferences_B
dtype: float64
- name: zeroshot_repetetive_A
dtype: float64
- name: zeroshot_repetetive_B
dtype: float64
- name: zeroshot_fail-to-consider-context_A
dtype: float64
- name: zeroshot_fail-to-consider-context_B
dtype: float64
- name: zeroshot_too-long_A
dtype: float64
- name: zeroshot_too-long_B
dtype: float64
splits:
- name: train
num_bytes: 22674534
num_examples: 9459
- name: test
num_bytes: 22627412
num_examples: 9459
download_size: 12124964
dataset_size: 45301946
---
# Dataset Card for "shp_with_features_20k_flan_t5_large_flan_t5_large_zeroshot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dongyoung4091/shp_with_features_20k_flan_t5_large_flan_t5_base_zeroshot | 2023-09-23T00:41:52.000Z | [
"region:us"
] | dongyoung4091 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: post_id
dtype: string
- name: domain
dtype: string
- name: upvote_ratio
dtype: float64
- name: history
dtype: string
- name: c_root_id_A
dtype: string
- name: c_root_id_B
dtype: string
- name: created_at_utc_A
dtype: int64
- name: created_at_utc_B
dtype: int64
- name: score_A
dtype: int64
- name: score_B
dtype: int64
- name: human_ref_A
dtype: string
- name: human_ref_B
dtype: string
- name: labels
dtype: int64
- name: seconds_difference
dtype: float64
- name: score_ratio
dtype: float64
- name: helpfulness_A
dtype: float64
- name: helpfulness_B
dtype: float64
- name: specificity_A
dtype: float64
- name: specificity_B
dtype: float64
- name: intent_A
dtype: float64
- name: intent_B
dtype: float64
- name: factuality_A
dtype: float64
- name: factuality_B
dtype: float64
- name: easy-to-understand_A
dtype: float64
- name: easy-to-understand_B
dtype: float64
- name: relevance_A
dtype: float64
- name: relevance_B
dtype: float64
- name: readability_A
dtype: float64
- name: readability_B
dtype: float64
- name: enough-detail_A
dtype: float64
- name: enough-detail_B
dtype: float64
- name: biased:_A
dtype: float64
- name: biased:_B
dtype: float64
- name: fail-to-consider-individual-preferences_A
dtype: float64
- name: fail-to-consider-individual-preferences_B
dtype: float64
- name: repetetive_A
dtype: float64
- name: repetetive_B
dtype: float64
- name: fail-to-consider-context_A
dtype: float64
- name: fail-to-consider-context_B
dtype: float64
- name: too-long_A
dtype: float64
- name: too-long_B
dtype: float64
- name: __index_level_0__
dtype: int64
- name: log_score_A
dtype: float64
- name: log_score_B
dtype: float64
- name: zeroshot_helpfulness_A
dtype: float64
- name: zeroshot_helpfulness_B
dtype: float64
- name: zeroshot_specificity_A
dtype: float64
- name: zeroshot_specificity_B
dtype: float64
- name: zeroshot_intent_A
dtype: float64
- name: zeroshot_intent_B
dtype: float64
- name: zeroshot_factuality_A
dtype: float64
- name: zeroshot_factuality_B
dtype: float64
- name: zeroshot_easy-to-understand_A
dtype: float64
- name: zeroshot_easy-to-understand_B
dtype: float64
- name: zeroshot_relevance_A
dtype: float64
- name: zeroshot_relevance_B
dtype: float64
- name: zeroshot_readability_A
dtype: float64
- name: zeroshot_readability_B
dtype: float64
- name: zeroshot_enough-detail_A
dtype: float64
- name: zeroshot_enough-detail_B
dtype: float64
- name: zeroshot_biased:_A
dtype: float64
- name: zeroshot_biased:_B
dtype: float64
- name: zeroshot_fail-to-consider-individual-preferences_A
dtype: float64
- name: zeroshot_fail-to-consider-individual-preferences_B
dtype: float64
- name: zeroshot_repetetive_A
dtype: float64
- name: zeroshot_repetetive_B
dtype: float64
- name: zeroshot_fail-to-consider-context_A
dtype: float64
- name: zeroshot_fail-to-consider-context_B
dtype: float64
- name: zeroshot_too-long_A
dtype: float64
- name: zeroshot_too-long_B
dtype: float64
splits:
- name: train
num_bytes: 22674534
num_examples: 9459
- name: test
num_bytes: 22627412
num_examples: 9459
download_size: 0
dataset_size: 45301946
---
# Dataset Card for "shp_with_features_20k_flan_t5_large_flan_t5_base_zeroshot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GHowe/riffusion-add-on-dataset | 2023-09-22T01:49:22.000Z | [
"region:us"
] | GHowe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Xwin-LM__Xwin-LM-13B-V0.1 | 2023-09-22T02:02:09.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Xwin-LM/Xwin-LM-13B-V0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Xwin-LM/Xwin-LM-13B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-13B-V0.1) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xwin-LM__Xwin-LM-13B-V0.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T02:00:45.467077](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-LM-13B-V0.1/blob/main/results_2023-09-22T02-00-45.467077.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.566903292760224,\n\
\ \"acc_stderr\": 0.03417586847288058,\n \"acc_norm\": 0.5707582939804706,\n\
\ \"acc_norm_stderr\": 0.034154352435207355,\n \"mc1\": 0.32068543451652387,\n\
\ \"mc1_stderr\": 0.016339170373280906,\n \"mc2\": 0.4595945101063146,\n\
\ \"mc2_stderr\": 0.01565952149527769\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225403,\n\
\ \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893446\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6381198964349731,\n\
\ \"acc_stderr\": 0.004795622757327147,\n \"acc_norm\": 0.8280223063134834,\n\
\ \"acc_norm_stderr\": 0.0037658983649388657\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.02977308271331987,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.02977308271331987\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.037724468575180276,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.037724468575180276\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.026593084516572267,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.026593084516572267\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860688,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860688\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5256410256410257,\n \"acc_stderr\": 0.025317649726448656,\n\
\ \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.025317649726448656\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7522935779816514,\n \"acc_stderr\": 0.01850814360254782,\n \"\
acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.01850814360254782\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n\
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835795,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835795\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n\
\ \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n\
\ \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.02572280220089581,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.02572280220089581\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4044692737430168,\n\
\ \"acc_stderr\": 0.016414440917293147,\n \"acc_norm\": 0.4044692737430168,\n\
\ \"acc_norm_stderr\": 0.016414440917293147\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388852,\n\
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388852\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301757,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301757\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409818,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409818\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144376,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144376\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43089960886571055,\n\
\ \"acc_stderr\": 0.012647695889547235,\n \"acc_norm\": 0.43089960886571055,\n\
\ \"acc_norm_stderr\": 0.012647695889547235\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5849673202614379,\n \"acc_stderr\": 0.01993362777685742,\n \
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.01993362777685742\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683913,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683913\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987249,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987249\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32068543451652387,\n\
\ \"mc1_stderr\": 0.016339170373280906,\n \"mc2\": 0.4595945101063146,\n\
\ \"mc2_stderr\": 0.01565952149527769\n }\n}\n```"
repo_url: https://huggingface.co/Xwin-LM/Xwin-LM-13B-V0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|arc:challenge|25_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hellaswag|10_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-00-45.467077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-00-45.467077.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T02-00-45.467077.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T02-00-45.467077.parquet'
- config_name: results
data_files:
- split: 2023_09_22T02_00_45.467077
path:
- results_2023-09-22T02-00-45.467077.parquet
- split: latest
path:
- results_2023-09-22T02-00-45.467077.parquet
---
# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-13B-V0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Xwin-LM/Xwin-LM-13B-V0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Xwin-LM/Xwin-LM-13B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-13B-V0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xwin-LM__Xwin-LM-13B-V0.1",
"harness_truthfulqa_mc_0",
split="train")
```
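The aggregated metrics can be loaded the same way; a minimal sketch assuming the "results" configuration and "latest" split listed in the YAML header above:

```python
from datasets import load_dataset

# "latest" always points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Xwin-LM__Xwin-LM-13B-V0.1",
	"results",
	split="latest")
```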
## Latest results
These are the [latest results from run 2023-09-22T02:00:45.467077](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-LM-13B-V0.1/blob/main/results_2023-09-22T02-00-45.467077.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.566903292760224,
"acc_stderr": 0.03417586847288058,
"acc_norm": 0.5707582939804706,
"acc_norm_stderr": 0.034154352435207355,
"mc1": 0.32068543451652387,
"mc1_stderr": 0.016339170373280906,
"mc2": 0.4595945101063146,
"mc2_stderr": 0.01565952149527769
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225403,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893446
},
"harness|hellaswag|10": {
"acc": 0.6381198964349731,
"acc_stderr": 0.004795622757327147,
"acc_norm": 0.8280223063134834,
"acc_norm_stderr": 0.0037658983649388657
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.02977308271331987,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.02977308271331987
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.037724468575180276,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.037724468575180276
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572267,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572267
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860688,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860688
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5256410256410257,
"acc_stderr": 0.025317649726448656,
"acc_norm": 0.5256410256410257,
"acc_norm_stderr": 0.025317649726448656
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.01850814360254782,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.01850814360254782
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835795,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835795
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.02572280220089581,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.02572280220089581
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4044692737430168,
"acc_stderr": 0.016414440917293147,
"acc_norm": 0.4044692737430168,
"acc_norm_stderr": 0.016414440917293147
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388852,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388852
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301757,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301757
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409818,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409818
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144376,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144376
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43089960886571055,
"acc_stderr": 0.012647695889547235,
"acc_norm": 0.43089960886571055,
"acc_norm_stderr": 0.012647695889547235
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5919117647058824,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.5919117647058824,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.01993362777685742,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.01993362777685742
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683913,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683913
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32068543451652387,
"mc1_stderr": 0.016339170373280906,
"mc2": 0.4595945101063146,
"mc2_stderr": 0.01565952149527769
}
}
```
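You can also query these aggregated numbers programmatically instead of copying them from this card. A minimal sketch, assuming the "results" configuration exposes a "latest" split as in other leaderboard detail repos:

```python
# Minimal sketch: load the aggregated "results" configuration described above.
# Assumption: the "latest" split name follows the convention of these
# leaderboard repos; adjust it if your copy only exposes timestamped splits.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_Xwin-LM__Xwin-LM-13B-V0.1",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics shown above
```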
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
bill202307/3gpp | 2023-09-22T02:06:46.000Z | [
"region:us"
] | bill202307 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_euclaise__falcon_1b_stage3 | 2023-09-22T02:14:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of euclaise/falcon_1b_stage3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [euclaise/falcon_1b_stage3](https://huggingface.co/euclaise/falcon_1b_stage3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_euclaise__falcon_1b_stage3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T02:12:44.101824](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage3/blob/main/results_2023-09-22T02-12-44.101824.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2547645754121847,\n\
\ \"acc_stderr\": 0.03145979569651152,\n \"acc_norm\": 0.25732002520368746,\n\
\ \"acc_norm_stderr\": 0.0314656741298122,\n \"mc1\": 0.211750305997552,\n\
\ \"mc1_stderr\": 0.014302068353925612,\n \"mc2\": 0.379233457445528,\n\
\ \"mc2_stderr\": 0.015326254345045762\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3054607508532423,\n \"acc_stderr\": 0.0134600804780025,\n\
\ \"acc_norm\": 0.3310580204778157,\n \"acc_norm_stderr\": 0.01375206241981783\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.41565425214100776,\n\
\ \"acc_stderr\": 0.004918272352137551,\n \"acc_norm\": 0.5408285202150966,\n\
\ \"acc_norm_stderr\": 0.004973117975062485\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n\
\ \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.31851851851851853,\n\
\ \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.15789473684210525,\n \"acc_stderr\": 0.029674167520101446,\n\
\ \"acc_norm\": 0.15789473684210525,\n \"acc_norm_stderr\": 0.029674167520101446\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.14,\n\
\ \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.14,\n \
\ \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.026480357179895678,\n\
\ \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.026480357179895678\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.03745554791462457,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.03745554791462457\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n\
\ \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.27167630057803466,\n\
\ \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149352,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149352\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.028185441301234095,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.028185441301234095\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.037800192304380135,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.037800192304380135\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113946,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113946\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\
\ \"acc_stderr\": 0.03268454013011745,\n \"acc_norm\": 0.15873015873015872,\n\
\ \"acc_norm_stderr\": 0.03268454013011745\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.267741935483871,\n \"acc_stderr\": 0.025189006660212385,\n \"\
acc_norm\": 0.267741935483871,\n \"acc_norm_stderr\": 0.025189006660212385\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n \"\
acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27979274611398963,\n \"acc_stderr\": 0.03239637046735703,\n\
\ \"acc_norm\": 0.27979274611398963,\n \"acc_norm_stderr\": 0.03239637046735703\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.022139081103971527,\n\
\ \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.022139081103971527\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.21851851851851853,\n \"acc_stderr\": 0.02519575225182378,\n \
\ \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.02519575225182378\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.029597329730978082,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.029597329730978082\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.20733944954128442,\n \"acc_stderr\": 0.017381415563608674,\n \"\
acc_norm\": 0.20733944954128442,\n \"acc_norm_stderr\": 0.017381415563608674\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2037037037037037,\n \"acc_stderr\": 0.02746740180405799,\n \"\
acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.02746740180405799\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693247,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693247\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21524663677130046,\n\
\ \"acc_stderr\": 0.02758406660220826,\n \"acc_norm\": 0.21524663677130046,\n\
\ \"acc_norm_stderr\": 0.02758406660220826\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302873,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302873\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n\
\ \"acc_stderr\": 0.026453508054040332,\n \"acc_norm\": 0.20512820512820512,\n\
\ \"acc_norm_stderr\": 0.026453508054040332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27458492975734355,\n\
\ \"acc_stderr\": 0.015959829933084032,\n \"acc_norm\": 0.27458492975734355,\n\
\ \"acc_norm_stderr\": 0.015959829933084032\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2745664739884393,\n \"acc_stderr\": 0.024027745155265012,\n\
\ \"acc_norm\": 0.2745664739884393,\n \"acc_norm_stderr\": 0.024027745155265012\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.02355083135199509,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.02355083135199509\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.33762057877813506,\n\
\ \"acc_stderr\": 0.02685882587948855,\n \"acc_norm\": 0.33762057877813506,\n\
\ \"acc_norm_stderr\": 0.02685882587948855\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.27469135802469136,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.27469135802469136,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180844,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24967405475880053,\n\
\ \"acc_stderr\": 0.011054538377832318,\n \"acc_norm\": 0.24967405475880053,\n\
\ \"acc_norm_stderr\": 0.011054538377832318\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2536764705882353,\n \"acc_stderr\": 0.026431329870789538,\n\
\ \"acc_norm\": 0.2536764705882353,\n \"acc_norm_stderr\": 0.026431329870789538\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2369281045751634,\n \"acc_stderr\": 0.017201662169789782,\n \
\ \"acc_norm\": 0.2369281045751634,\n \"acc_norm_stderr\": 0.017201662169789782\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721377,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721377\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.21224489795918366,\n \"acc_stderr\": 0.026176967197866764,\n\
\ \"acc_norm\": 0.21224489795918366,\n \"acc_norm_stderr\": 0.026176967197866764\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.30845771144278605,\n\
\ \"acc_stderr\": 0.03265819588512699,\n \"acc_norm\": 0.30845771144278605,\n\
\ \"acc_norm_stderr\": 0.03265819588512699\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.24096385542168675,\n\
\ \"acc_stderr\": 0.03329394119073529,\n \"acc_norm\": 0.24096385542168675,\n\
\ \"acc_norm_stderr\": 0.03329394119073529\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.211750305997552,\n\
\ \"mc1_stderr\": 0.014302068353925612,\n \"mc2\": 0.379233457445528,\n\
\ \"mc2_stderr\": 0.015326254345045762\n }\n}\n```"
repo_url: https://huggingface.co/euclaise/falcon_1b_stage3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|arc:challenge|25_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hellaswag|10_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-12-44.101824.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-12-44.101824.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T02-12-44.101824.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T02-12-44.101824.parquet'
- config_name: results
data_files:
- split: 2023_09_22T02_12_44.101824
path:
- results_2023-09-22T02-12-44.101824.parquet
- split: latest
path:
- results_2023-09-22T02-12-44.101824.parquet
---
# Dataset Card for Evaluation run of euclaise/falcon_1b_stage3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/euclaise/falcon_1b_stage3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [euclaise/falcon_1b_stage3](https://huggingface.co/euclaise/falcon_1b_stage3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_euclaise__falcon_1b_stage3",
"harness_truthfulqa_mc_0",
split="train")
```
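If you prefer to discover the configurations programmatically rather than reading the YAML above, a minimal sketch (assuming your installed `datasets` version exposes the `get_dataset_config_names` helper):

```python
# Minimal sketch: enumerate the available configurations, then load the
# "latest" split of one of them. Per the YAML configs above, every
# configuration has both a timestamped split and a "latest" split that
# mirrors the most recent run.
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_euclaise__falcon_1b_stage3"
configs = get_dataset_config_names(repo)
print(configs[:5])  # e.g. harness_arc_challenge_25, harness_hellaswag_10, ...

data = load_dataset(repo, "harness_arc_challenge_25", split="latest")
```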
## Latest results
These are the [latest results from run 2023-09-22T02:12:44.101824](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage3/blob/main/results_2023-09-22T02-12-44.101824.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2547645754121847,
"acc_stderr": 0.03145979569651152,
"acc_norm": 0.25732002520368746,
"acc_norm_stderr": 0.0314656741298122,
"mc1": 0.211750305997552,
"mc1_stderr": 0.014302068353925612,
"mc2": 0.379233457445528,
"mc2_stderr": 0.015326254345045762
},
"harness|arc:challenge|25": {
"acc": 0.3054607508532423,
"acc_stderr": 0.0134600804780025,
"acc_norm": 0.3310580204778157,
"acc_norm_stderr": 0.01375206241981783
},
"harness|hellaswag|10": {
"acc": 0.41565425214100776,
"acc_stderr": 0.004918272352137551,
"acc_norm": 0.5408285202150966,
"acc_norm_stderr": 0.004973117975062485
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.15789473684210525,
"acc_stderr": 0.029674167520101446,
"acc_norm": 0.15789473684210525,
"acc_norm_stderr": 0.029674167520101446
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.14,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.14,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24528301886792453,
"acc_stderr": 0.026480357179895678,
"acc_norm": 0.24528301886792453,
"acc_norm_stderr": 0.026480357179895678
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03745554791462457,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03745554791462457
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149352,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149352
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.028185441301234095,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.028185441301234095
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.037800192304380135,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.037800192304380135
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113946,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113946
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.03268454013011745,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.03268454013011745
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.267741935483871,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.267741935483871,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27979274611398963,
"acc_stderr": 0.03239637046735703,
"acc_norm": 0.27979274611398963,
"acc_norm_stderr": 0.03239637046735703
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.022139081103971527,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.022139081103971527
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21851851851851853,
"acc_stderr": 0.02519575225182378,
"acc_norm": 0.21851851851851853,
"acc_norm_stderr": 0.02519575225182378
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.029597329730978082,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.029597329730978082
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.20733944954128442,
"acc_stderr": 0.017381415563608674,
"acc_norm": 0.20733944954128442,
"acc_norm_stderr": 0.017381415563608674
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.02746740180405799,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.02746740180405799
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693247,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693247
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21524663677130046,
"acc_stderr": 0.02758406660220826,
"acc_norm": 0.21524663677130046,
"acc_norm_stderr": 0.02758406660220826
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302873,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302873
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.026453508054040332,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.026453508054040332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27458492975734355,
"acc_stderr": 0.015959829933084032,
"acc_norm": 0.27458492975734355,
"acc_norm_stderr": 0.015959829933084032
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2745664739884393,
"acc_stderr": 0.024027745155265012,
"acc_norm": 0.2745664739884393,
"acc_norm_stderr": 0.024027745155265012
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.33762057877813506,
"acc_stderr": 0.02685882587948855,
"acc_norm": 0.33762057877813506,
"acc_norm_stderr": 0.02685882587948855
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.27469135802469136,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.27469135802469136,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180844,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24967405475880053,
"acc_stderr": 0.011054538377832318,
"acc_norm": 0.24967405475880053,
"acc_norm_stderr": 0.011054538377832318
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2536764705882353,
"acc_stderr": 0.026431329870789538,
"acc_norm": 0.2536764705882353,
"acc_norm_stderr": 0.026431329870789538
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2369281045751634,
"acc_stderr": 0.017201662169789782,
"acc_norm": 0.2369281045751634,
"acc_norm_stderr": 0.017201662169789782
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721377,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721377
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.21224489795918366,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.21224489795918366,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.30845771144278605,
"acc_stderr": 0.03265819588512699,
"acc_norm": 0.30845771144278605,
"acc_norm_stderr": 0.03265819588512699
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-virology|5": {
"acc": 0.24096385542168675,
"acc_stderr": 0.03329394119073529,
"acc_norm": 0.24096385542168675,
"acc_norm_stderr": 0.03329394119073529
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.211750305997552,
"mc1_stderr": 0.014302068353925612,
"mc2": 0.379233457445528,
"mc2_stderr": 0.015326254345045762
}
}
```
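To relate the per-task numbers above to the leaderboard aggregates, a minimal sketch (the `results.json` file name and `results` variable are ours, for illustration only): the MMLU figure shown on the leaderboard corresponds to the mean `acc` over the 57 `harness|hendrycksTest-*` tasks.

```python
import json

# Assumes the JSON shown above has been saved to results.json
# (hypothetical file name, for illustration).
with open("results.json") as f:
    results = json.load(f)

# Average the per-task accuracy over all MMLU (hendrycksTest) subtasks.
mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]
mmlu_acc = sum(results[t]["acc"] for t in mmlu_tasks) / len(mmlu_tasks)
print(f"MMLU mean acc over {len(mmlu_tasks)} tasks: {mmlu_acc:.4f}")
```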
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_NoIdeaLand__test-3k-mx | 2023-09-22T02:21:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NoIdeaLand/test-3k-mx
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NoIdeaLand/test-3k-mx](https://huggingface.co/NoIdeaLand/test-3k-mx) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NoIdeaLand__test-3k-mx\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T02:20:18.679270](https://huggingface.co/datasets/open-llm-leaderboard/details_NoIdeaLand__test-3k-mx/blob/main/results_2023-09-22T02-20-18.679270.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the \"results\" config and in the \"latest\" split of\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25937377998421673,\n\
\ \"acc_stderr\": 0.03158826848264918,\n \"acc_norm\": 0.263032034220802,\n\
\ \"acc_norm_stderr\": 0.031588822884227444,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871108,\n \"mc2\": 0.4093188700877857,\n\
\ \"mc2_stderr\": 0.014339231042407396\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3438566552901024,\n \"acc_stderr\": 0.013880644570156201,\n\
\ \"acc_norm\": 0.38054607508532423,\n \"acc_norm_stderr\": 0.014188277712349822\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.48516231826329415,\n\
\ \"acc_stderr\": 0.004987583858923224,\n \"acc_norm\": 0.6643098984266083,\n\
\ \"acc_norm_stderr\": 0.004712660409846823\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n\
\ \"acc_stderr\": 0.034065420585026505,\n \"acc_norm\": 0.1925925925925926,\n\
\ \"acc_norm_stderr\": 0.034065420585026505\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2528301886792453,\n \"acc_stderr\": 0.026749899771241235,\n\
\ \"acc_norm\": 0.2528301886792453,\n \"acc_norm_stderr\": 0.026749899771241235\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.031265112061730445,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.031265112061730445\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.029644006577009618,\n\
\ \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.029644006577009618\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.035058596825972656,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.035058596825972656\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.037528339580033376,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.037528339580033376\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21428571428571427,\n \"acc_stderr\": 0.02113285918275445,\n \"\
acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02113285918275445\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276863,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276863\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2032258064516129,\n \"acc_stderr\": 0.022891687984554952,\n \"\
acc_norm\": 0.2032258064516129,\n \"acc_norm_stderr\": 0.022891687984554952\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.21182266009852216,\n \"acc_stderr\": 0.028748983689941065,\n \"\
acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.028748983689941065\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18686868686868688,\n \"acc_stderr\": 0.027772533334218977,\n \"\
acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.027772533334218977\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.031821550509166484,\n\
\ \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.031821550509166484\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.28717948717948716,\n \"acc_stderr\": 0.022939925418530616,\n\
\ \"acc_norm\": 0.28717948717948716,\n \"acc_norm_stderr\": 0.022939925418530616\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.02720537153827947,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.02720537153827947\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1981651376146789,\n \"acc_stderr\": 0.017090573804217885,\n \"\
acc_norm\": 0.1981651376146789,\n \"acc_norm_stderr\": 0.017090573804217885\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1712962962962963,\n \"acc_stderr\": 0.02569534164382467,\n \"\
acc_norm\": 0.1712962962962963,\n \"acc_norm_stderr\": 0.02569534164382467\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2911392405063291,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n\
\ \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.34080717488789236,\n\
\ \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3504273504273504,\n\
\ \"acc_stderr\": 0.03125610824421881,\n \"acc_norm\": 0.3504273504273504,\n\
\ \"acc_norm_stderr\": 0.03125610824421881\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26053639846743293,\n\
\ \"acc_stderr\": 0.01569600856380709,\n \"acc_norm\": 0.26053639846743293,\n\
\ \"acc_norm_stderr\": 0.01569600856380709\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044273,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044273\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961455,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961455\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2282958199356913,\n\
\ \"acc_stderr\": 0.023839303311398222,\n \"acc_norm\": 0.2282958199356913,\n\
\ \"acc_norm_stderr\": 0.023839303311398222\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005716,\n\
\ \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2770534550195567,\n\
\ \"acc_stderr\": 0.011430462443719676,\n \"acc_norm\": 0.2770534550195567,\n\
\ \"acc_norm_stderr\": 0.011430462443719676\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541104,\n\
\ \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541104\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.03571609230053481,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.03571609230053481\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871108,\n \"mc2\": 0.4093188700877857,\n\
\ \"mc2_stderr\": 0.014339231042407396\n }\n}\n```"
repo_url: https://huggingface.co/NoIdeaLand/test-3k-mx
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|arc:challenge|25_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hellaswag|10_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T02-20-18.679270.parquet'
- config_name: results
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- results_2023-09-22T02-20-18.679270.parquet
- split: latest
path:
- results_2023-09-22T02-20-18.679270.parquet
---
# Dataset Card for Evaluation run of NoIdeaLand/test-3k-mx
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NoIdeaLand/test-3k-mx
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NoIdeaLand/test-3k-mx](https://huggingface.co/NoIdeaLand/test-3k-mx) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NoIdeaLand__test-3k-mx",
"harness_truthfulqa_mc_0",
split="train")
```
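The aggregated scores can also be loaded directly through the "results" configuration; a minimal sketch (both the "results" config and the "latest" split are declared in this card's YAML header):

```python
from datasets import load_dataset

# "latest" points to the most recent run; timestamped splits are also available.
results = load_dataset("open-llm-leaderboard/details_NoIdeaLand__test-3k-mx",
                       "results",
                       split="latest")
```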
## Latest results
These are the [latest results from run 2023-09-22T02:20:18.679270](https://huggingface.co/datasets/open-llm-leaderboard/details_NoIdeaLand__test-3k-mx/blob/main/results_2023-09-22T02-20-18.679270.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.25937377998421673,
"acc_stderr": 0.03158826848264918,
"acc_norm": 0.263032034220802,
"acc_norm_stderr": 0.031588822884227444,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871108,
"mc2": 0.4093188700877857,
"mc2_stderr": 0.014339231042407396
},
"harness|arc:challenge|25": {
"acc": 0.3438566552901024,
"acc_stderr": 0.013880644570156201,
"acc_norm": 0.38054607508532423,
"acc_norm_stderr": 0.014188277712349822
},
"harness|hellaswag|10": {
"acc": 0.48516231826329415,
"acc_stderr": 0.004987583858923224,
"acc_norm": 0.6643098984266083,
"acc_norm_stderr": 0.004712660409846823
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.034065420585026505,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.034065420585026505
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2528301886792453,
"acc_stderr": 0.026749899771241235,
"acc_norm": 0.2528301886792453,
"acc_norm_stderr": 0.026749899771241235
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.031265112061730445,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.031265112061730445
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.029644006577009618,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.029644006577009618
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.035058596825972656,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.035058596825972656
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.037528339580033376,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.037528339580033376
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02113285918275445,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02113285918275445
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276863,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276863
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2032258064516129,
"acc_stderr": 0.022891687984554952,
"acc_norm": 0.2032258064516129,
"acc_norm_stderr": 0.022891687984554952
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.028748983689941065,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.028748983689941065
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18686868686868688,
"acc_stderr": 0.027772533334218977,
"acc_norm": 0.18686868686868688,
"acc_norm_stderr": 0.027772533334218977
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.031821550509166484,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.031821550509166484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28717948717948716,
"acc_stderr": 0.022939925418530616,
"acc_norm": 0.28717948717948716,
"acc_norm_stderr": 0.022939925418530616
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.02720537153827947,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.02720537153827947
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1981651376146789,
"acc_stderr": 0.017090573804217885,
"acc_norm": 0.1981651376146789,
"acc_norm_stderr": 0.017090573804217885
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1712962962962963,
"acc_stderr": 0.02569534164382467,
"acc_norm": 0.1712962962962963,
"acc_norm_stderr": 0.02569534164382467
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.031811497470553604,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.031811497470553604
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3504273504273504,
"acc_stderr": 0.03125610824421881,
"acc_norm": 0.3504273504273504,
"acc_norm_stderr": 0.03125610824421881
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26053639846743293,
"acc_stderr": 0.01569600856380709,
"acc_norm": 0.26053639846743293,
"acc_norm_stderr": 0.01569600856380709
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044273,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961455,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961455
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2282958199356913,
"acc_stderr": 0.023839303311398222,
"acc_norm": 0.2282958199356913,
"acc_norm_stderr": 0.023839303311398222
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2345679012345679,
"acc_stderr": 0.023576881744005716,
"acc_norm": 0.2345679012345679,
"acc_norm_stderr": 0.023576881744005716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.025892151156709405,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.025892151156709405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2770534550195567,
"acc_stderr": 0.011430462443719676,
"acc_norm": 0.2770534550195567,
"acc_norm_stderr": 0.011430462443719676
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.024231013370541104,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.024231013370541104
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2530612244897959,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.2530612244897959,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.03571609230053481,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.03571609230053481
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871108,
"mc2": 0.4093188700877857,
"mc2_stderr": 0.014339231042407396
}
}
```
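To work with just the aggregated numbers rather than the per-sample details, here is a minimal sketch using the "results" configuration listed in the front matter above (its "latest" split points at the most recent run):
```python
from datasets import load_dataset

# A minimal sketch: the "results" configuration holds the aggregated metrics
# shown above, and its "latest" split always points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_NoIdeaLand__test-3k-mx",
                       "results",
                       split="latest")
print(results[0])
```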
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
qgyd2021/chinese_chitchat | 2023-09-22T08:39:11.000Z | [
"size_categories:100M<n<1B",
"language:zh",
"license:apache-2.0",
"chitchat",
"region:us"
] | qgyd2021 | null | @dataset{chinese_chitchat,
author = {Xing Tian},
title = {chinese_chitchat},
month = sep,
year = 2023,
publisher = {Xing Tian},
version = {1.0},
} | null | 0 | 0 | ---
license: apache-2.0
language:
- zh
tags:
- chitchat
size_categories:
- 100M<n<1B
---
## Chinese Chitchat Dataset
The `role` field takes one of three values: "unknown", "human", "assistant".
The data was collected and organized from the web as follows (a minimal loading sketch follows the table):
| Dataset | Original data / project URL | Samples | Corpus description | Alternative download |
| :--- | :---: | :---: | :---: | :---: |
| ChatterBot | [ChatterBot](https://github.com/gunthercox/ChatterBot); [chatterbot-corpus](https://github.com/gunthercox/chatterbot-corpus) | 560 | Categorized by type; relatively high quality | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao |
| douban | [Douban Conversation Corpus](https://github.com/MarkWuNLP/MultiTurnResponseSelection) | 3.52M | From a paper by Beihang University and Microsoft; relatively little noise; multi-turn (7.6 turns on average) | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao |
| ptt | [PTT Chinese Corpus](https://github.com/zake7749/Gossiping-Chinese-Corpus) | 770K | Open-source project; the Gossiping board of Taiwan's PTT forum; Traditional Chinese; everyday register with some noise | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao |
| qingyun | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao | 100K | Qingyun corpus; fairly good quality; everyday conversation | |
| subtitle | [TV drama dialogue corpus](https://github.com/aceimnorstuvwxz/dgk_lost_conv) | 2.74M | Subtitles crawled from movies and US TV shows; some noise and loose dialogue; speakers cannot be matched up; multi-turn (5.3 turns on average) | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao |
| tieba | [Tieba forum reply corpus](https://pan.baidu.com/s/1mUknfwy1nhSM7XzH8xi7gQ); password: i4si | 2.32M | Multi-turn; noisy | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao |
| weibo | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao | 4.43M | From a Huawei paper | |
| xiaohuangji | [Xiaohuangji corpus](https://github.com/candlewill/Dialog_Corpus) | 450K | Corpus from the original Renren project; contains some indecent dialogue and a little noise | [Aliyun Drive](https://www.aliyundrive.com/s/qXBdAYtz5j5); extraction code: 81ao |
<details>
<summary>Referenced data sources (expand to view)</summary>
<pre>
<code>
https://github.com/codemayq/chinese_chatbot_corpus
https://github.com/yangjianxin1/GPT2-chitchat
</code>
</pre>
</details>
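To peek at the data, here is a minimal loading sketch. Note that the subset name `"qingyun"` is only an assumption drawn from the table above; the repository may expose different configuration names, and the exact column layout should be checked against one sample.
```python
from datasets import load_dataset

# A minimal sketch; the subset name "qingyun" is an assumption taken from
# the table above -- the repository may expose different configuration names.
# Each turn is expected to carry a role of "unknown", "human" or "assistant",
# as described above; print one sample to see the actual schema.
data = load_dataset("qgyd2021/chinese_chitchat", "qingyun", split="train")
print(data[0])
```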
|
bossdjbr/kekeuu | 2023-09-22T03:09:42.000Z | [
"region:us"
] | bossdjbr | null | null | null | 0 | 0 | Entry not found |
CyberHarem/nagatoro_hayase_donttoywithmemissnagatoro | 2023-09-22T02:47:57.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Nagatoro Hayase
This is the dataset of Nagatoro Hayase, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 650 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 650 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 650 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 650 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
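To fetch one of the packaged variants programmatically, here is a minimal sketch using `huggingface_hub` (the filename comes from the table above; any of the listed archives works the same way):
```python
from huggingface_hub import hf_hub_download

# A minimal sketch: download one packaged variant listed in the table above.
path = hf_hub_download(
    repo_id="CyberHarem/nagatoro_hayase_donttoywithmemissnagatoro",
    filename="dataset-640x880.zip",
    repo_type="dataset",
)
print(path)  # local path to the downloaded zip archive
```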
|
CyberHarem/wahira_nagomi_akibameidosensou | 2023-09-22T02:58:29.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Wahira Nagomi
This is the dataset of Wahira Nagomi, containing 295 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 295 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 738 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 295 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 295 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 295 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 295 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 295 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 738 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 738 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 738 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
bossdjbr/mckekeu | 2023-09-22T02:57:31.000Z | [
"region:us"
] | bossdjbr | null | null | null | 0 | 0 | Entry not found |
CyberHarem/maki_gamou_donttoywithmemissnagatoro | 2023-09-22T03:02:47.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Maki Gamou
This is the dataset of Maki Gamou, containing 141 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 141 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 351 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 141 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 141 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 141 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 141 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 141 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 351 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 351 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 351 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
yzyly/lard-1500-dataset | 2023-09-22T03:08:17.000Z | [
"region:us"
] | yzyly | null | null | null | 0 | 0 | Entry not found |
CyberHarem/yoshi_donttoywithmemissnagatoro | 2023-09-22T03:11:10.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yoshi
This is the dataset of Yoshi, containing 86 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 86 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 223 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 86 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 86 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 86 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 86 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 86 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 223 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 223 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 223 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/sakura_donttoywithmemissnagatoro | 2023-09-22T03:17:32.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Sakura
This is the dataset of Sakura, containing 79 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 79 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 178 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 79 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 79 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 79 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 79 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 79 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 178 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 178 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 178 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/mannen_ranko_akibameidosensou | 2023-09-22T03:23:41.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Mannen Ranko
This is the dataset of Mannen Ranko, containing 263 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 263 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 616 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 263 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 263 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 263 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 263 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 263 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 616 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 616 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 616 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
riquinho21/vozdigo1 | 2023-09-22T03:21:18.000Z | [
"license:unknown",
"region:us"
] | riquinho21 | null | null | null | 0 | 0 | ---
license: unknown
---
|
CyberHarem/yumechi_akibameidosensou | 2023-09-22T03:38:10.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yumechi
This is the dataset of Yumechi, containing 164 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 164 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 407 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 164 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 164 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 164 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 164 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 164 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 407 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 407 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 407 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-chat-temp | 2023-09-22T03:38:54.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of YeungNLP/firefly-llama2-7b-chat-temp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YeungNLP/firefly-llama2-7b-chat-temp](https://huggingface.co/YeungNLP/firefly-llama2-7b-chat-temp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-chat-temp\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T03:37:32.448737](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-chat-temp/blob/main/results_2023-09-22T03-37-32.448737.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.45675748784133063,\n\
\ \"acc_stderr\": 0.03523878979242221,\n \"acc_norm\": 0.46039529518457817,\n\
\ \"acc_norm_stderr\": 0.03522948379579862,\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.016255241993179178,\n \"mc2\": 0.46775413014717326,\n\
\ \"mc2_stderr\": 0.015305512973889742\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.48293515358361777,\n \"acc_stderr\": 0.0146028783885366,\n\
\ \"acc_norm\": 0.5119453924914675,\n \"acc_norm_stderr\": 0.014607220340597167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5476000796654052,\n\
\ \"acc_stderr\": 0.004967118575905287,\n \"acc_norm\": 0.7332204740091616,\n\
\ \"acc_norm_stderr\": 0.004413722823053159\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5056603773584906,\n \"acc_stderr\": 0.03077090076385131,\n\
\ \"acc_norm\": 0.5056603773584906,\n \"acc_norm_stderr\": 0.03077090076385131\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n\
\ \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.4305555555555556,\n\
\ \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.1568627450980392,\n \"acc_stderr\": 0.036186648199362466,\n\
\ \"acc_norm\": 0.1568627450980392,\n \"acc_norm_stderr\": 0.036186648199362466\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491841,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491841\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4774193548387097,\n\
\ \"acc_stderr\": 0.028414985019707868,\n \"acc_norm\": 0.4774193548387097,\n\
\ \"acc_norm_stderr\": 0.028414985019707868\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n\
\ \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.03883565977956929,\n\
\ \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.03883565977956929\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5656565656565656,\n \"acc_stderr\": 0.03531505879359183,\n \"\
acc_norm\": 0.5656565656565656,\n \"acc_norm_stderr\": 0.03531505879359183\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5751295336787565,\n \"acc_stderr\": 0.035674713352125395,\n\
\ \"acc_norm\": 0.5751295336787565,\n \"acc_norm_stderr\": 0.035674713352125395\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4128205128205128,\n \"acc_stderr\": 0.024962683564331803,\n\
\ \"acc_norm\": 0.4128205128205128,\n \"acc_norm_stderr\": 0.024962683564331803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.39915966386554624,\n \"acc_stderr\": 0.03181110032413925,\n\
\ \"acc_norm\": 0.39915966386554624,\n \"acc_norm_stderr\": 0.03181110032413925\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5834862385321101,\n \"acc_stderr\": 0.021136376504030864,\n \"\
acc_norm\": 0.5834862385321101,\n \"acc_norm_stderr\": 0.021136376504030864\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3101851851851852,\n \"acc_stderr\": 0.03154696285656629,\n \"\
acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.03154696285656629\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5637254901960784,\n \"acc_stderr\": 0.03480693138457039,\n \"\
acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.03480693138457039\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5991561181434599,\n \"acc_stderr\": 0.031900803894732356,\n \
\ \"acc_norm\": 0.5991561181434599,\n \"acc_norm_stderr\": 0.031900803894732356\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5112107623318386,\n\
\ \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.5112107623318386,\n\
\ \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n\
\ \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179662,\n\
\ \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179662\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n\
\ \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03088273697413866,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03088273697413866\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5862068965517241,\n\
\ \"acc_stderr\": 0.01761220408466377,\n \"acc_norm\": 0.5862068965517241,\n\
\ \"acc_norm_stderr\": 0.01761220408466377\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5057803468208093,\n \"acc_stderr\": 0.026917296179149123,\n\
\ \"acc_norm\": 0.5057803468208093,\n \"acc_norm_stderr\": 0.026917296179149123\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.01431099954796145,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.01431099954796145\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.028624412550167958,\n\
\ \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.028624412550167958\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5144694533762058,\n\
\ \"acc_stderr\": 0.02838619808417768,\n \"acc_norm\": 0.5144694533762058,\n\
\ \"acc_norm_stderr\": 0.02838619808417768\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963764,\n \
\ \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963764\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3376792698826597,\n\
\ \"acc_stderr\": 0.012078563777145574,\n \"acc_norm\": 0.3376792698826597,\n\
\ \"acc_norm_stderr\": 0.012078563777145574\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.03004261583271487,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.03004261583271487\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.44281045751633985,\n \"acc_stderr\": 0.02009508315457735,\n \
\ \"acc_norm\": 0.44281045751633985,\n \"acc_norm_stderr\": 0.02009508315457735\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40408163265306124,\n \"acc_stderr\": 0.03141470802586589,\n\
\ \"acc_norm\": 0.40408163265306124,\n \"acc_norm_stderr\": 0.03141470802586589\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.672514619883041,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.672514619883041,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.016255241993179178,\n \"mc2\": 0.46775413014717326,\n\
\ \"mc2_stderr\": 0.015305512973889742\n }\n}\n```"
repo_url: https://huggingface.co/YeungNLP/firefly-llama2-7b-chat-temp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|arc:challenge|25_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hellaswag|10_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T03-37-32.448737.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-37-32.448737.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T03-37-32.448737.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T03-37-32.448737.parquet'
- config_name: results
data_files:
- split: 2023_09_22T03_37_32.448737
path:
- results_2023-09-22T03-37-32.448737.parquet
- split: latest
path:
- results_2023-09-22T03-37-32.448737.parquet
---
# Dataset Card for Evaluation run of YeungNLP/firefly-llama2-7b-chat-temp
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/YeungNLP/firefly-llama2-7b-chat-temp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [YeungNLP/firefly-llama2-7b-chat-temp](https://huggingface.co/YeungNLP/firefly-llama2-7b-chat-temp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-chat-temp",
"harness_truthfulqa_mc_0",
split="train")
```
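
The aggregated scores can be loaded the same way through the `results` configuration. A minimal sketch, assuming the "results" config and "latest" split listed in the YAML metadata above:

```python
from datasets import load_dataset

# Aggregated results: one row per evaluation run rather than per-sample details.
results = load_dataset(
    "open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-chat-temp",
    "results",
    split="latest",
)
```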
## Latest results
These are the [latest results from run 2023-09-22T03:37:32.448737](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-chat-temp/blob/main/results_2023-09-22T03-37-32.448737.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.45675748784133063,
"acc_stderr": 0.03523878979242221,
"acc_norm": 0.46039529518457817,
"acc_norm_stderr": 0.03522948379579862,
"mc1": 0.31456548347613217,
"mc1_stderr": 0.016255241993179178,
"mc2": 0.46775413014717326,
"mc2_stderr": 0.015305512973889742
},
"harness|arc:challenge|25": {
"acc": 0.48293515358361777,
"acc_stderr": 0.0146028783885366,
"acc_norm": 0.5119453924914675,
"acc_norm_stderr": 0.014607220340597167
},
"harness|hellaswag|10": {
"acc": 0.5476000796654052,
"acc_stderr": 0.004967118575905287,
"acc_norm": 0.7332204740091616,
"acc_norm_stderr": 0.004413722823053159
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5056603773584906,
"acc_stderr": 0.03077090076385131,
"acc_norm": 0.5056603773584906,
"acc_norm_stderr": 0.03077090076385131
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.04140685639111502,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.04140685639111502
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.1568627450980392,
"acc_stderr": 0.036186648199362466,
"acc_norm": 0.1568627450980392,
"acc_norm_stderr": 0.036186648199362466
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02256989707491841,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02256989707491841
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4774193548387097,
"acc_stderr": 0.028414985019707868,
"acc_norm": 0.4774193548387097,
"acc_norm_stderr": 0.028414985019707868
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.03883565977956929,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.03883565977956929
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5656565656565656,
"acc_stderr": 0.03531505879359183,
"acc_norm": 0.5656565656565656,
"acc_norm_stderr": 0.03531505879359183
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5751295336787565,
"acc_stderr": 0.035674713352125395,
"acc_norm": 0.5751295336787565,
"acc_norm_stderr": 0.035674713352125395
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4128205128205128,
"acc_stderr": 0.024962683564331803,
"acc_norm": 0.4128205128205128,
"acc_norm_stderr": 0.024962683564331803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.39915966386554624,
"acc_stderr": 0.03181110032413925,
"acc_norm": 0.39915966386554624,
"acc_norm_stderr": 0.03181110032413925
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5834862385321101,
"acc_stderr": 0.021136376504030864,
"acc_norm": 0.5834862385321101,
"acc_norm_stderr": 0.021136376504030864
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.03154696285656629,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.03154696285656629
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.03480693138457039,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.03480693138457039
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5991561181434599,
"acc_stderr": 0.031900803894732356,
"acc_norm": 0.5991561181434599,
"acc_norm_stderr": 0.031900803894732356
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5112107623318386,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.5112107623318386,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179662,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179662
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.048828405482122375,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.048828405482122375
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03088273697413866,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03088273697413866
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.01761220408466377,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.01761220408466377
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5057803468208093,
"acc_stderr": 0.026917296179149123,
"acc_norm": 0.5057803468208093,
"acc_norm_stderr": 0.026917296179149123
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.01431099954796145,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.01431099954796145
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.028624412550167958,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.028624412550167958
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5144694533762058,
"acc_stderr": 0.02838619808417768,
"acc_norm": 0.5144694533762058,
"acc_norm_stderr": 0.02838619808417768
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.028473501272963764,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.028473501272963764
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3376792698826597,
"acc_stderr": 0.012078563777145574,
"acc_norm": 0.3376792698826597,
"acc_norm_stderr": 0.012078563777145574
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.03004261583271487,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.03004261583271487
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.44281045751633985,
"acc_stderr": 0.02009508315457735,
"acc_norm": 0.44281045751633985,
"acc_norm_stderr": 0.02009508315457735
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40408163265306124,
"acc_stderr": 0.03141470802586589,
"acc_norm": 0.40408163265306124,
"acc_norm_stderr": 0.03141470802586589
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.672514619883041,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.672514619883041,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31456548347613217,
"mc1_stderr": 0.016255241993179178,
"mc2": 0.46775413014717326,
"mc2_stderr": 0.015305512973889742
}
}
```
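
The same figures can also be fetched directly from the raw results file linked above. A short sketch, assuming the JSON's top-level layout matches the excerpt (this is an assumption; the on-disk file may wrap these metrics in extra keys):

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_YeungNLP__firefly-llama2-7b-chat-temp",
    filename="results_2023-09-22T03-37-32.448737.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Assuming the top level matches the excerpt above, this prints the aggregated accuracy.
print(results["all"]["acc"])
```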
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jb723__llama2-ko-7B-model | 2023-09-22T03:47:31.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jb723/llama2-ko-7B-model
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jb723/llama2-ko-7B-model](https://huggingface.co/jb723/llama2-ko-7B-model) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jb723__llama2-ko-7B-model\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T03:46:09.444345](https://huggingface.co/datasets/open-llm-leaderboard/details_jb723__llama2-ko-7B-model/blob/main/results_2023-09-22T03-46-09.444345.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4611255933915564,\n\
\ \"acc_stderr\": 0.0351963298015563,\n \"acc_norm\": 0.46464091811201347,\n\
\ \"acc_norm_stderr\": 0.03518095823947449,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.015744027248256045,\n \"mc2\": 0.4097811489004275,\n\
\ \"mc2_stderr\": 0.015552291335837638\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5349829351535836,\n \"acc_stderr\": 0.014575583922019674,\n\
\ \"acc_norm\": 0.5631399317406144,\n \"acc_norm_stderr\": 0.014494421584256532\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6158135829516033,\n\
\ \"acc_stderr\": 0.004854082479916911,\n \"acc_norm\": 0.7950607448715395,\n\
\ \"acc_norm_stderr\": 0.004028322654852745\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.45660377358490567,\n \"acc_stderr\": 0.030656748696739438,\n \
\ \"acc_norm\": 0.45660377358490567,\n \"acc_norm_stderr\": 0.030656748696739438\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.4046242774566474,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.14705882352941177,\n \"acc_stderr\": 0.035240689515674495,\n\
\ \"acc_norm\": 0.14705882352941177,\n \"acc_norm_stderr\": 0.035240689515674495\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992072,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523812,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523812\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4935483870967742,\n\
\ \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.4935483870967742,\n\
\ \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n\
\ \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n\
\ \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5757575757575758,\n \"acc_stderr\": 0.035212249088415845,\n \"\
acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.035212249088415845\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.689119170984456,\n \"acc_stderr\": 0.033403619062765864,\n\
\ \"acc_norm\": 0.689119170984456,\n \"acc_norm_stderr\": 0.033403619062765864\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4461538461538462,\n \"acc_stderr\": 0.02520357177302833,\n \
\ \"acc_norm\": 0.4461538461538462,\n \"acc_norm_stderr\": 0.02520357177302833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712173,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712173\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478466,\n\
\ \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478466\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6146788990825688,\n \"acc_stderr\": 0.02086585085279412,\n \"\
acc_norm\": 0.6146788990825688,\n \"acc_norm_stderr\": 0.02086585085279412\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257017,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257017\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.03460228327239171,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03460228327239171\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.030685820596610795,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.030685820596610795\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n\
\ \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n\
\ \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.04374928560599738,\n\
\ \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.04374928560599738\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.44785276073619634,\n \"acc_stderr\": 0.03906947479456602,\n\
\ \"acc_norm\": 0.44785276073619634,\n \"acc_norm_stderr\": 0.03906947479456602\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n\
\ \"acc_stderr\": 0.030679022765498828,\n \"acc_norm\": 0.6752136752136753,\n\
\ \"acc_norm_stderr\": 0.030679022765498828\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6577266922094508,\n\
\ \"acc_stderr\": 0.016967031766413624,\n \"acc_norm\": 0.6577266922094508,\n\
\ \"acc_norm_stderr\": 0.016967031766413624\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382868,\n\
\ \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382868\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.477124183006536,\n \"acc_stderr\": 0.028599936776089782,\n\
\ \"acc_norm\": 0.477124183006536,\n \"acc_norm_stderr\": 0.028599936776089782\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n\
\ \"acc_stderr\": 0.028217683556652315,\n \"acc_norm\": 0.5562700964630225,\n\
\ \"acc_norm_stderr\": 0.028217683556652315\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5216049382716049,\n \"acc_stderr\": 0.027794760105008736,\n\
\ \"acc_norm\": 0.5216049382716049,\n \"acc_norm_stderr\": 0.027794760105008736\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3475177304964539,\n \"acc_stderr\": 0.02840662780959095,\n \
\ \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.02840662780959095\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35528031290743156,\n\
\ \"acc_stderr\": 0.012223623364044037,\n \"acc_norm\": 0.35528031290743156,\n\
\ \"acc_norm_stderr\": 0.012223623364044037\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.03016191193076711,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.03016191193076711\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4526143790849673,\n \"acc_stderr\": 0.020136790918492523,\n \
\ \"acc_norm\": 0.4526143790849673,\n \"acc_norm_stderr\": 0.020136790918492523\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4326530612244898,\n \"acc_stderr\": 0.031717528240626645,\n\
\ \"acc_norm\": 0.4326530612244898,\n \"acc_norm_stderr\": 0.031717528240626645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.572139303482587,\n\
\ \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.572139303482587,\n\
\ \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.015744027248256045,\n \"mc2\": 0.4097811489004275,\n\
\ \"mc2_stderr\": 0.015552291335837638\n }\n}\n```"
repo_url: https://huggingface.co/jb723/llama2-ko-7B-model
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|arc:challenge|25_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hellaswag|10_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T03-46-09.444345.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-46-09.444345.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T03-46-09.444345.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T03-46-09.444345.parquet'
- config_name: results
data_files:
- split: 2023_09_22T03_46_09.444345
path:
- results_2023-09-22T03-46-09.444345.parquet
- split: latest
path:
- results_2023-09-22T03-46-09.444345.parquet
---
# Dataset Card for Evaluation run of jb723/llama2-ko-7B-model
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jb723/llama2-ko-7B-model
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jb723/llama2-ko-7B-model](https://huggingface.co/jb723/llama2-ko-7B-model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jb723__llama2-ko-7B-model",
"harness_truthfulqa_mc_0",
split="train")
```
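The aggregated results mentioned above are exposed as their own configuration. A minimal sketch for loading them (assuming the `datasets` library resolves this repo's YAML configs as listed; the exact row schema depends on the evaluation harness):
```python
from datasets import load_dataset

# "latest" always points at the most recent run; the timestamped split
# (e.g. "2023_09_22T03_46_09.444345") pins a specific evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_jb723__llama2-ko-7B-model",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics
```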
## Latest results
These are the [latest results from run 2023-09-22T03:46:09.444345](https://huggingface.co/datasets/open-llm-leaderboard/details_jb723__llama2-ko-7B-model/blob/main/results_2023-09-22T03-46-09.444345.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4611255933915564,
"acc_stderr": 0.0351963298015563,
"acc_norm": 0.46464091811201347,
"acc_norm_stderr": 0.03518095823947449,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.015744027248256045,
"mc2": 0.4097811489004275,
"mc2_stderr": 0.015552291335837638
},
"harness|arc:challenge|25": {
"acc": 0.5349829351535836,
"acc_stderr": 0.014575583922019674,
"acc_norm": 0.5631399317406144,
"acc_norm_stderr": 0.014494421584256532
},
"harness|hellaswag|10": {
"acc": 0.6158135829516033,
"acc_stderr": 0.004854082479916911,
"acc_norm": 0.7950607448715395,
"acc_norm_stderr": 0.004028322654852745
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.45660377358490567,
"acc_stderr": 0.030656748696739438,
"acc_norm": 0.45660377358490567,
"acc_norm_stderr": 0.030656748696739438
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.14705882352941177,
"acc_stderr": 0.035240689515674495,
"acc_norm": 0.14705882352941177,
"acc_norm_stderr": 0.035240689515674495
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992072,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523812,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523812
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4935483870967742,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.4935483870967742,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.035212249088415845,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.035212249088415845
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.689119170984456,
"acc_stderr": 0.033403619062765864,
"acc_norm": 0.689119170984456,
"acc_norm_stderr": 0.033403619062765864
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4461538461538462,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.4461538461538462,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712173,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712173
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478466,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478466
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6146788990825688,
"acc_stderr": 0.02086585085279412,
"acc_norm": 0.6146788990825688,
"acc_norm_stderr": 0.02086585085279412
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257017,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257017
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03460228327239171,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03460228327239171
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.030685820596610795,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.030685820596610795
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44785276073619634,
"acc_stderr": 0.03906947479456602,
"acc_norm": 0.44785276073619634,
"acc_norm_stderr": 0.03906947479456602
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6752136752136753,
"acc_stderr": 0.030679022765498828,
"acc_norm": 0.6752136752136753,
"acc_norm_stderr": 0.030679022765498828
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6577266922094508,
"acc_stderr": 0.016967031766413624,
"acc_norm": 0.6577266922094508,
"acc_norm_stderr": 0.016967031766413624
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.026897049996382868,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.026897049996382868
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.477124183006536,
"acc_stderr": 0.028599936776089782,
"acc_norm": 0.477124183006536,
"acc_norm_stderr": 0.028599936776089782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5562700964630225,
"acc_stderr": 0.028217683556652315,
"acc_norm": 0.5562700964630225,
"acc_norm_stderr": 0.028217683556652315
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5216049382716049,
"acc_stderr": 0.027794760105008736,
"acc_norm": 0.5216049382716049,
"acc_norm_stderr": 0.027794760105008736
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.02840662780959095,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.02840662780959095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35528031290743156,
"acc_stderr": 0.012223623364044037,
"acc_norm": 0.35528031290743156,
"acc_norm_stderr": 0.012223623364044037
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.03016191193076711,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.03016191193076711
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4526143790849673,
"acc_stderr": 0.020136790918492523,
"acc_norm": 0.4526143790849673,
"acc_norm_stderr": 0.020136790918492523
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4326530612244898,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.4326530612244898,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.572139303482587,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.572139303482587,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.015744027248256045,
"mc2": 0.4097811489004275,
"mc2_stderr": 0.015552291335837638
}
}
```
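If you prefer working with the raw JSON linked above rather than the parquet splits, a minimal sketch (the filename comes from the link above; the exact nesting of the aggregate block is an assumption and may differ):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the results file referenced above from this dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jb723__llama2-ko-7B-model",
    filename="results_2023-09-22T03-46-09.444345.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The metrics may sit at the top level or under a "results" key (assumption).
metrics = data.get("results", data)
print(metrics["all"]["acc"], metrics["all"]["mc2"])
```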
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/shiipon_akibameidosensou | 2023-09-22T03:48:18.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Shiipon
This is the dataset of Shiipon, containing 132 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 132 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 327 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 132 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 132 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 132 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 132 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 132 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 327 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 327 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 327 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
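The zip archives in the table are plain repository files, so one way to fetch them programmatically is via `huggingface_hub` (a sketch, assuming the files sit at the repo root as the relative links suggest):
```python
from huggingface_hub import hf_hub_download

# Download one of the packaged variants listed above, e.g. the raw data.
path = hf_hub_download(
    repo_id="CyberHarem/shiipon_akibameidosensou",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(path)  # local path to the downloaded archive
```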
|
open-llm-leaderboard/details_PocketDoc__Dans-RetroRodeo-13b | 2023-09-22T03:53:13.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PocketDoc/Dans-RetroRodeo-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PocketDoc/Dans-RetroRodeo-13b](https://huggingface.co/PocketDoc/Dans-RetroRodeo-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PocketDoc__Dans-RetroRodeo-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T03:51:50.269402](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-RetroRodeo-13b/blob/main/results_2023-09-22T03-51-50.269402.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4913079134787104,\n\
\ \"acc_stderr\": 0.03513136594854453,\n \"acc_norm\": 0.49534566078833525,\n\
\ \"acc_norm_stderr\": 0.035115867628459356,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156494,\n \"mc2\": 0.38726662936200895,\n\
\ \"mc2_stderr\": 0.014083047614327703\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4974402730375427,\n \"acc_stderr\": 0.014611199329843784,\n\
\ \"acc_norm\": 0.53839590443686,\n \"acc_norm_stderr\": 0.01456824555029636\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5989842660824537,\n\
\ \"acc_stderr\": 0.004891025533633032,\n \"acc_norm\": 0.7962557259510058,\n\
\ \"acc_norm_stderr\": 0.004019578428155064\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.030656748696739435,\n\
\ \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.030656748696739435\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4513888888888889,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.4513888888888889,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.048523658709390974,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.048523658709390974\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.032081157507886836,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.032081157507886836\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.024326310529149138,\n \"\
acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.024326310529149138\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03718489006818115,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03718489006818115\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6064516129032258,\n\
\ \"acc_stderr\": 0.027791878753132267,\n \"acc_norm\": 0.6064516129032258,\n\
\ \"acc_norm_stderr\": 0.027791878753132267\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.034381579670365446,\n\
\ \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.034381579670365446\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03825460278380026,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6161616161616161,\n\
\ \"acc_stderr\": 0.0346488167501634,\n \"acc_norm\": 0.6161616161616161,\n\
\ \"acc_norm_stderr\": 0.0346488167501634\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.0330881859441575,\n\
\ \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.0330881859441575\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4564102564102564,\n \"acc_stderr\": 0.0252544854247996,\n \
\ \"acc_norm\": 0.4564102564102564,\n \"acc_norm_stderr\": 0.0252544854247996\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371218,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371218\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4579831932773109,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.4579831932773109,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6422018348623854,\n \"acc_stderr\": 0.020552060784827825,\n \"\
acc_norm\": 0.6422018348623854,\n \"acc_norm_stderr\": 0.020552060784827825\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6470588235294118,\n \"acc_stderr\": 0.03354092437591518,\n \"\
acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.03354092437591518\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6455696202531646,\n \"acc_stderr\": 0.031137304297185805,\n \
\ \"acc_norm\": 0.6455696202531646,\n \"acc_norm_stderr\": 0.031137304297185805\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906276,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906276\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6012269938650306,\n \"acc_stderr\": 0.03847021420456023,\n\
\ \"acc_norm\": 0.6012269938650306,\n \"acc_norm_stderr\": 0.03847021420456023\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n\
\ \"acc_stderr\": 0.029996951858349483,\n \"acc_norm\": 0.7008547008547008,\n\
\ \"acc_norm_stderr\": 0.029996951858349483\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6538952745849298,\n\
\ \"acc_stderr\": 0.01701196526641207,\n \"acc_norm\": 0.6538952745849298,\n\
\ \"acc_norm_stderr\": 0.01701196526641207\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.02658923114217426,\n\
\ \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.02658923114217426\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.028452639985088016,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.028452639985088016\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946205,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946205\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49691358024691357,\n \"acc_stderr\": 0.027820214158594377,\n\
\ \"acc_norm\": 0.49691358024691357,\n \"acc_norm_stderr\": 0.027820214158594377\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.028602085862759412,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.028602085862759412\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39960886571056065,\n\
\ \"acc_stderr\": 0.012510181636960667,\n \"acc_norm\": 0.39960886571056065,\n\
\ \"acc_norm_stderr\": 0.012510181636960667\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.030372836961539352,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.030372836961539352\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.48366013071895425,\n \"acc_stderr\": 0.020217030653186457,\n \"\
acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.020217030653186457\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.509090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5673469387755102,\n \"acc_stderr\": 0.031717528240626645,\n\
\ \"acc_norm\": 0.5673469387755102,\n \"acc_norm_stderr\": 0.031717528240626645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156494,\n \"mc2\": 0.38726662936200895,\n\
\ \"mc2_stderr\": 0.014083047614327703\n }\n}\n```"
repo_url: https://huggingface.co/PocketDoc/Dans-RetroRodeo-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|arc:challenge|25_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hellaswag|10_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T03-51-50.269402.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T03-51-50.269402.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T03-51-50.269402.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T03-51-50.269402.parquet'
- config_name: results
data_files:
- split: 2023_09_22T03_51_50.269402
path:
- results_2023-09-22T03-51-50.269402.parquet
- split: latest
path:
- results_2023-09-22T03-51-50.269402.parquet
---
# Dataset Card for Evaluation run of PocketDoc/Dans-RetroRodeo-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PocketDoc/Dans-RetroRodeo-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PocketDoc/Dans-RetroRodeo-13b](https://huggingface.co/PocketDoc/Dans-RetroRodeo-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-RetroRodeo-13b",
"harness_truthfulqa_mc_0",
split="train")
```
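The aggregated metrics live in the "results" configuration listed in the configs above; as a small companion example, you can read the most recent run through its "latest" split (both the config name and the split name are taken from the configs section of this card):
```python
from datasets import load_dataset

# "latest" always mirrors the most recent timestamped split of the
# "results" configuration for this evaluation repo.
results = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-RetroRodeo-13b",
	"results",
	split="latest")
```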
## Latest results
These are the [latest results from run 2023-09-22T03:51:50.269402](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-RetroRodeo-13b/blob/main/results_2023-09-22T03-51-50.269402.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4913079134787104,
"acc_stderr": 0.03513136594854453,
"acc_norm": 0.49534566078833525,
"acc_norm_stderr": 0.035115867628459356,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156494,
"mc2": 0.38726662936200895,
"mc2_stderr": 0.014083047614327703
},
"harness|arc:challenge|25": {
"acc": 0.4974402730375427,
"acc_stderr": 0.014611199329843784,
"acc_norm": 0.53839590443686,
"acc_norm_stderr": 0.01456824555029636
},
"harness|hellaswag|10": {
"acc": 0.5989842660824537,
"acc_stderr": 0.004891025533633032,
"acc_norm": 0.7962557259510058,
"acc_norm_stderr": 0.004019578428155064
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.030656748696739435,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.030656748696739435
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4513888888888889,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.4513888888888889,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709390974,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709390974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.024326310529149138,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.024326310529149138
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03718489006818115,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03718489006818115
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6064516129032258,
"acc_stderr": 0.027791878753132267,
"acc_norm": 0.6064516129032258,
"acc_norm_stderr": 0.027791878753132267
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.034381579670365446,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.034381579670365446
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6,
"acc_stderr": 0.03825460278380026,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03825460278380026
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.0346488167501634,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.0346488167501634
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.0330881859441575,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.0330881859441575
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4564102564102564,
"acc_stderr": 0.0252544854247996,
"acc_norm": 0.4564102564102564,
"acc_norm_stderr": 0.0252544854247996
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371218,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371218
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4579831932773109,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.4579831932773109,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6422018348623854,
"acc_stderr": 0.020552060784827825,
"acc_norm": 0.6422018348623854,
"acc_norm_stderr": 0.020552060784827825
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.03354092437591518,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.03354092437591518
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6455696202531646,
"acc_stderr": 0.031137304297185805,
"acc_norm": 0.6455696202531646,
"acc_norm_stderr": 0.031137304297185805
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906276,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906276
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6012269938650306,
"acc_stderr": 0.03847021420456023,
"acc_norm": 0.6012269938650306,
"acc_norm_stderr": 0.03847021420456023
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7008547008547008,
"acc_stderr": 0.029996951858349483,
"acc_norm": 0.7008547008547008,
"acc_norm_stderr": 0.029996951858349483
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6538952745849298,
"acc_stderr": 0.01701196526641207,
"acc_norm": 0.6538952745849298,
"acc_norm_stderr": 0.01701196526641207
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.02658923114217426,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.02658923114217426
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.028452639985088016,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.028452639985088016
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946205,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946205
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49691358024691357,
"acc_stderr": 0.027820214158594377,
"acc_norm": 0.49691358024691357,
"acc_norm_stderr": 0.027820214158594377
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.028602085862759412,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.028602085862759412
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39960886571056065,
"acc_stderr": 0.012510181636960667,
"acc_norm": 0.39960886571056065,
"acc_norm_stderr": 0.012510181636960667
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5,
"acc_stderr": 0.030372836961539352,
"acc_norm": 0.5,
"acc_norm_stderr": 0.030372836961539352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.020217030653186457,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.020217030653186457
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5673469387755102,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.5673469387755102,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156494,
"mc2": 0.38726662936200895,
"mc2_stderr": 0.014083047614327703
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/zoya_akibameidosensou | 2023-09-22T03:56:27.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Zoya
This is the dataset of Zoya, containing 114 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 114 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 276 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 114 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 114 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 114 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 114 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 114 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 276 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 276 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 276 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
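For programmatic access, here is a minimal sketch using `huggingface_hub`, assuming the zip archives sit at the root of this dataset repository as the relative links in the table above suggest:
```python
from huggingface_hub import hf_hub_download

# Fetch one of the aligned archives from this dataset repo; the call
# returns the local cache path of the downloaded zip.
path = hf_hub_download(
    repo_id="CyberHarem/zoya_akibameidosensou",
    filename="dataset-512x704.zip",
    repo_type="dataset",
)
```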
|
AA12312424/rv13-1600 | 2023-09-22T07:57:46.000Z | [
"region:us"
] | AA12312424 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/yaegashi_yasuko_akibameidosensou | 2023-09-22T04:12:46.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yaegashi Yasuko
This is the dataset of Yaegashi Yasuko, containing 156 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 156 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 372 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 156 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 156 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 156 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 156 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 156 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 372 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 372 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 372 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-1.1bee | 2023-09-22T04:14:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of BEE-spoke-data/TinyLlama-1.1bee
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BEE-spoke-data/TinyLlama-1.1bee](https://huggingface.co/BEE-spoke-data/TinyLlama-1.1bee)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-1.1bee\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T04:13:14.200799](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-1.1bee/blob/main/results_2023-09-22T04-13-14.200799.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24543328318908342,\n\
\ \"acc_stderr\": 0.031215697549559097,\n \"acc_norm\": 0.24827144678006,\n\
\ \"acc_norm_stderr\": 0.031228116271352042,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752329,\n \"mc2\": 0.39011272446721906,\n\
\ \"mc2_stderr\": 0.014482564173653177\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.26109215017064846,\n \"acc_stderr\": 0.012835523909473838,\n\
\ \"acc_norm\": 0.3054607508532423,\n \"acc_norm_stderr\": 0.013460080478002501\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.39494124676359293,\n\
\ \"acc_stderr\": 0.004878390226591721,\n \"acc_norm\": 0.5180242979486158,\n\
\ \"acc_norm_stderr\": 0.004986538243846636\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17037037037037037,\n\
\ \"acc_stderr\": 0.032477811859955935,\n \"acc_norm\": 0.17037037037037037,\n\
\ \"acc_norm_stderr\": 0.032477811859955935\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.02895734278834235,\n\
\ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.02895734278834235\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.0383515395439942,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.0383515395439942\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948365,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948365\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18387096774193548,\n\
\ \"acc_stderr\": 0.02203721734026784,\n \"acc_norm\": 0.18387096774193548,\n\
\ \"acc_norm_stderr\": 0.02203721734026784\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.1921182266009852,\n \"acc_stderr\": 0.027719315709614778,\n\
\ \"acc_norm\": 0.1921182266009852,\n \"acc_norm_stderr\": 0.027719315709614778\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.1717171717171717,\n \"acc_stderr\": 0.026869716187429917,\n \"\
acc_norm\": 0.1717171717171717,\n \"acc_norm_stderr\": 0.026869716187429917\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.03027690994517825,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.03027690994517825\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.19487179487179487,\n \"acc_stderr\": 0.020083167595181393,\n\
\ \"acc_norm\": 0.19487179487179487,\n \"acc_norm_stderr\": 0.020083167595181393\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958927,\n\
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958927\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"\
acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21651376146788992,\n \"acc_stderr\": 0.01765871059444314,\n \"\
acc_norm\": 0.21651376146788992,\n \"acc_norm_stderr\": 0.01765871059444314\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16666666666666666,\n \"acc_stderr\": 0.025416428388767485,\n \"\
acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.025416428388767485\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841043,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841043\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.14563106796116504,\n \"acc_stderr\": 0.0349260647662379,\n\
\ \"acc_norm\": 0.14563106796116504,\n \"acc_norm_stderr\": 0.0349260647662379\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.029745048572674036,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.029745048572674036\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.227330779054917,\n\
\ \"acc_stderr\": 0.014987270640946019,\n \"acc_norm\": 0.227330779054917,\n\
\ \"acc_norm_stderr\": 0.014987270640946019\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757173,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757173\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574894,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574894\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888146,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888146\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.22508038585209003,\n\
\ \"acc_stderr\": 0.02372008851617903,\n \"acc_norm\": 0.22508038585209003,\n\
\ \"acc_norm_stderr\": 0.02372008851617903\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.20679012345679013,\n \"acc_stderr\": 0.02253500670594282,\n\
\ \"acc_norm\": 0.20679012345679013,\n \"acc_norm_stderr\": 0.02253500670594282\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843003,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843003\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26988265971316816,\n\
\ \"acc_stderr\": 0.011337381084250402,\n \"acc_norm\": 0.26988265971316816,\n\
\ \"acc_norm_stderr\": 0.011337381084250402\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1801470588235294,\n \"acc_stderr\": 0.02334516361654485,\n\
\ \"acc_norm\": 0.1801470588235294,\n \"acc_norm_stderr\": 0.02334516361654485\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.30718954248366015,\n \"acc_stderr\": 0.01866335967146367,\n \
\ \"acc_norm\": 0.30718954248366015,\n \"acc_norm_stderr\": 0.01866335967146367\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.2545454545454545,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.028920583220675575,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.028920583220675575\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03615507630310935,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03615507630310935\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752329,\n \"mc2\": 0.39011272446721906,\n\
\ \"mc2_stderr\": 0.014482564173653177\n }\n}\n```"
repo_url: https://huggingface.co/BEE-spoke-data/TinyLlama-1.1bee
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|arc:challenge|25_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hellaswag|10_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T04-13-14.200799.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-13-14.200799.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T04-13-14.200799.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T04-13-14.200799.parquet'
- config_name: results
data_files:
- split: 2023_09_22T04_13_14.200799
path:
- results_2023-09-22T04-13-14.200799.parquet
- split: latest
path:
- results_2023-09-22T04-13-14.200799.parquet
---
# Dataset Card for Evaluation run of BEE-spoke-data/TinyLlama-1.1bee
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/BEE-spoke-data/TinyLlama-1.1bee
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [BEE-spoke-data/TinyLlama-1.1bee](https://huggingface.co/BEE-spoke-data/TinyLlama-1.1bee) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-1.1bee",
"harness_truthfulqa_mc_0",
split="train")
```
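The aggregated metrics can be loaded the same way through the "results" configuration declared above, with the "latest" split pointing at the most recent run (a minimal sketch based on the configs listed in the metadata):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split always resolves to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-1.1bee",
    "results",
    split="latest",
)
```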
## Latest results
These are the [latest results from run 2023-09-22T04:13:14.200799](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__TinyLlama-1.1bee/blob/main/results_2023-09-22T04-13-14.200799.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24543328318908342,
"acc_stderr": 0.031215697549559097,
"acc_norm": 0.24827144678006,
"acc_norm_stderr": 0.031228116271352042,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752329,
"mc2": 0.39011272446721906,
"mc2_stderr": 0.014482564173653177
},
"harness|arc:challenge|25": {
"acc": 0.26109215017064846,
"acc_stderr": 0.012835523909473838,
"acc_norm": 0.3054607508532423,
"acc_norm_stderr": 0.013460080478002501
},
"harness|hellaswag|10": {
"acc": 0.39494124676359293,
"acc_stderr": 0.004878390226591721,
"acc_norm": 0.5180242979486158,
"acc_norm_stderr": 0.004986538243846636
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17037037037037037,
"acc_stderr": 0.032477811859955935,
"acc_norm": 0.17037037037037037,
"acc_norm_stderr": 0.032477811859955935
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.02895734278834235,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.02895734278834235
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0383515395439942,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0383515395439942
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948365,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948365
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18387096774193548,
"acc_stderr": 0.02203721734026784,
"acc_norm": 0.18387096774193548,
"acc_norm_stderr": 0.02203721734026784
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1921182266009852,
"acc_stderr": 0.027719315709614778,
"acc_norm": 0.1921182266009852,
"acc_norm_stderr": 0.027719315709614778
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1717171717171717,
"acc_stderr": 0.026869716187429917,
"acc_norm": 0.1717171717171717,
"acc_norm_stderr": 0.026869716187429917
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.03027690994517825,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.03027690994517825
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.19487179487179487,
"acc_stderr": 0.020083167595181393,
"acc_norm": 0.19487179487179487,
"acc_norm_stderr": 0.020083167595181393
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958927,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696525,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21651376146788992,
"acc_stderr": 0.01765871059444314,
"acc_norm": 0.21651376146788992,
"acc_norm_stderr": 0.01765871059444314
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.025416428388767485,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.025416428388767485
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841043,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841043
},
"harness|hendrycksTest-management|5": {
"acc": 0.14563106796116504,
"acc_stderr": 0.0349260647662379,
"acc_norm": 0.14563106796116504,
"acc_norm_stderr": 0.0349260647662379
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.029745048572674036,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.029745048572674036
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.227330779054917,
"acc_stderr": 0.014987270640946019,
"acc_norm": 0.227330779054917,
"acc_norm_stderr": 0.014987270640946019
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757173,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757173
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574894,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574894
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023805186524888146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023805186524888146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.22508038585209003,
"acc_stderr": 0.02372008851617903,
"acc_norm": 0.22508038585209003,
"acc_norm_stderr": 0.02372008851617903
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.20679012345679013,
"acc_stderr": 0.02253500670594282,
"acc_norm": 0.20679012345679013,
"acc_norm_stderr": 0.02253500670594282
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843003,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843003
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26988265971316816,
"acc_stderr": 0.011337381084250402,
"acc_norm": 0.26988265971316816,
"acc_norm_stderr": 0.011337381084250402
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1801470588235294,
"acc_stderr": 0.02334516361654485,
"acc_norm": 0.1801470588235294,
"acc_norm_stderr": 0.02334516361654485
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.30718954248366015,
"acc_stderr": 0.01866335967146367,
"acc_norm": 0.30718954248366015,
"acc_norm_stderr": 0.01866335967146367
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.028920583220675575,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.028920583220675575
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03615507630310935,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03615507630310935
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752329,
"mc2": 0.39011272446721906,
"mc2_stderr": 0.014482564173653177
}
}
```
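As a worked example, the per-task numbers above can be collapsed into a single MMLU score by averaging `acc` over every `hendrycksTest-*` entry. This is a minimal sketch, assuming the linked JSON file is available locally; the task dictionary may sit at the top level or under a "results" key, so both layouts are handled:
```python
import json

# Parse the results file linked above.
with open("results_2023-09-22T04-13-14.200799.json") as f:
    data = json.load(f)
results = data.get("results", data)  # tolerate either layout

# Average accuracy over the MMLU (hendrycksTest) subtasks.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
print(f"MMLU average acc over {len(mmlu_accs)} tasks: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```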
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
legacy107/qa_wikipedia_retrieved_chunks-og | 2023-09-22T04:20:01.000Z | [
"region:us"
] | legacy107 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer_start
dtype: int64
- name: answer
dtype: string
- name: article
dtype: string
- name: retrieved_context
dtype: string
splits:
- name: train
num_bytes: 6210772758
num_examples: 110970
- name: validation
num_bytes: 732036635
num_examples: 13833
- name: test
num_bytes: 762734936
num_examples: 13873
download_size: 417751805
dataset_size: 7705544329
---
# Dataset Card for "qa_wikipedia_retrieved_chunks-og"
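Given the features and splits declared in the metadata above, a minimal loading sketch:
```python
from datasets import load_dataset

# Load the train split and inspect one QA record with its retrieved context.
ds = load_dataset("legacy107/qa_wikipedia_retrieved_chunks-og", split="train")
example = ds[0]
print(example["question"])
print(example["answer"])
print(example["retrieved_context"][:200])
```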
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/nagi_akibameidosensou | 2023-09-22T04:20:49.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Nagi
This is the dataset of Nagi, containing 70 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 70 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 167 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 70 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 70 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 70 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 70 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 70 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 167 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 167 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 167 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
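Any of the archives above can be fetched directly from this repo; a minimal sketch using `huggingface_hub`, assuming the zip files sit at the repo root as listed in the table:
```python
from huggingface_hub import hf_hub_download

# Download the 512x512 aligned archive from this dataset repository.
path = hf_hub_download(
    repo_id="CyberHarem/nagi_akibameidosensou",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
print(path)
```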
|
philrwebb/SolutionArchitectInstruct | 2023-09-22T04:19:57.000Z | [
"region:us"
] | philrwebb | null | null | null | 0 | 0 | Entry not found |
ty-kim/test2 | 2023-09-22T04:42:23.000Z | [
"region:us"
] | ty-kim | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds | 2023-09-22T04:47:12.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of DevaMalla/llama_7b_qlora_cds
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DevaMalla/llama_7b_qlora_cds](https://huggingface.co/DevaMalla/llama_7b_qlora_cds)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T04:45:53.038804](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds/blob/main/results_2023-09-22T04-45-53.038804.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3311519760109131,\n\
\ \"acc_stderr\": 0.03370272331161324,\n \"acc_norm\": 0.3349306016471848,\n\
\ \"acc_norm_stderr\": 0.033689358841766895,\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.016150201321323013,\n \"mc2\": 0.4613848175485234,\n\
\ \"mc2_stderr\": 0.014633300293081979\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49658703071672355,\n \"acc_stderr\": 0.014611050403244081,\n\
\ \"acc_norm\": 0.5247440273037542,\n \"acc_norm_stderr\": 0.014593487694937738\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5828520215096594,\n\
\ \"acc_stderr\": 0.00492080031323274,\n \"acc_norm\": 0.7776339374626569,\n\
\ \"acc_norm_stderr\": 0.004149859300604921\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.362962962962963,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.362962962962963,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.035834961763610625,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.035834961763610625\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.36981132075471695,\n \"acc_stderr\": 0.029711421880107915,\n\
\ \"acc_norm\": 0.36981132075471695,\n \"acc_norm_stderr\": 0.029711421880107915\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2986111111111111,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.2986111111111111,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2832369942196532,\n\
\ \"acc_stderr\": 0.03435568056047875,\n \"acc_norm\": 0.2832369942196532,\n\
\ \"acc_norm_stderr\": 0.03435568056047875\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307811,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307811\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3574468085106383,\n \"acc_stderr\": 0.03132941789476425,\n\
\ \"acc_norm\": 0.3574468085106383,\n \"acc_norm_stderr\": 0.03132941789476425\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481404,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481404\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.0360010569272777,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.0360010569272777\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.02306818884826111,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02306818884826111\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3193548387096774,\n \"acc_stderr\": 0.02652270967466777,\n \"\
acc_norm\": 0.3193548387096774,\n \"acc_norm_stderr\": 0.02652270967466777\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n \"\
acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.41818181818181815,\n \"acc_stderr\": 0.03851716319398395,\n\
\ \"acc_norm\": 0.41818181818181815,\n \"acc_norm_stderr\": 0.03851716319398395\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2878787878787879,\n \"acc_stderr\": 0.03225883512300993,\n \"\
acc_norm\": 0.2878787878787879,\n \"acc_norm_stderr\": 0.03225883512300993\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37305699481865284,\n \"acc_stderr\": 0.034902055920485744,\n\
\ \"acc_norm\": 0.37305699481865284,\n \"acc_norm_stderr\": 0.034902055920485744\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2717948717948718,\n \"acc_stderr\": 0.02255655101013235,\n \
\ \"acc_norm\": 0.2717948717948718,\n \"acc_norm_stderr\": 0.02255655101013235\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.02921354941437217,\n \
\ \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.02921354941437217\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.19205298013245034,\n \"acc_stderr\": 0.03216298420593613,\n \"\
acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.03216298420593613\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3743119266055046,\n \"acc_stderr\": 0.02074895940898833,\n \"\
acc_norm\": 0.3743119266055046,\n \"acc_norm_stderr\": 0.02074895940898833\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18518518518518517,\n \"acc_stderr\": 0.026491914727355157,\n \"\
acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.026491914727355157\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3382352941176471,\n \"acc_stderr\": 0.03320574612945431,\n \"\
acc_norm\": 0.3382352941176471,\n \"acc_norm_stderr\": 0.03320574612945431\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3459915611814346,\n \"acc_stderr\": 0.03096481058878671,\n \
\ \"acc_norm\": 0.3459915611814346,\n \"acc_norm_stderr\": 0.03096481058878671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4260089686098655,\n\
\ \"acc_stderr\": 0.03318833286217281,\n \"acc_norm\": 0.4260089686098655,\n\
\ \"acc_norm_stderr\": 0.03318833286217281\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3053435114503817,\n \"acc_stderr\": 0.04039314978724562,\n\
\ \"acc_norm\": 0.3053435114503817,\n \"acc_norm_stderr\": 0.04039314978724562\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5619834710743802,\n \"acc_stderr\": 0.04529146804435792,\n \"\
acc_norm\": 0.5619834710743802,\n \"acc_norm_stderr\": 0.04529146804435792\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37962962962962965,\n\
\ \"acc_stderr\": 0.04691521224077741,\n \"acc_norm\": 0.37962962962962965,\n\
\ \"acc_norm_stderr\": 0.04691521224077741\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.36809815950920244,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.36809815950920244,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.04620284082280039,\n\
\ \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.04620284082280039\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.43162393162393164,\n\
\ \"acc_stderr\": 0.0324483553531149,\n \"acc_norm\": 0.43162393162393164,\n\
\ \"acc_norm_stderr\": 0.0324483553531149\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.39208173690932313,\n\
\ \"acc_stderr\": 0.017458524050147636,\n \"acc_norm\": 0.39208173690932313,\n\
\ \"acc_norm_stderr\": 0.017458524050147636\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.36127167630057805,\n \"acc_stderr\": 0.025862201852277895,\n\
\ \"acc_norm\": 0.36127167630057805,\n \"acc_norm_stderr\": 0.025862201852277895\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.34967320261437906,\n \"acc_stderr\": 0.0273053080762747,\n\
\ \"acc_norm\": 0.34967320261437906,\n \"acc_norm_stderr\": 0.0273053080762747\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\
\ \"acc_stderr\": 0.0254942593506949,\n \"acc_norm\": 0.2797427652733119,\n\
\ \"acc_norm_stderr\": 0.0254942593506949\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.36419753086419754,\n \"acc_stderr\": 0.026774929899722327,\n\
\ \"acc_norm\": 0.36419753086419754,\n \"acc_norm_stderr\": 0.026774929899722327\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2907801418439716,\n \"acc_stderr\": 0.027090664368353178,\n \
\ \"acc_norm\": 0.2907801418439716,\n \"acc_norm_stderr\": 0.027090664368353178\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2848761408083442,\n\
\ \"acc_stderr\": 0.011527830846369016,\n \"acc_norm\": 0.2848761408083442,\n\
\ \"acc_norm_stderr\": 0.011527830846369016\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.028418208619406794,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.028418208619406794\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3480392156862745,\n \"acc_stderr\": 0.019270998708223974,\n \
\ \"acc_norm\": 0.3480392156862745,\n \"acc_norm_stderr\": 0.019270998708223974\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4090909090909091,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.4090909090909091,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.23673469387755103,\n \"acc_stderr\": 0.02721283588407315,\n\
\ \"acc_norm\": 0.23673469387755103,\n \"acc_norm_stderr\": 0.02721283588407315\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.03333333333333334,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.03333333333333334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683228,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683228\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.47953216374269003,\n \"acc_stderr\": 0.038316105328219316,\n\
\ \"acc_norm\": 0.47953216374269003,\n \"acc_norm_stderr\": 0.038316105328219316\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.016150201321323013,\n \"mc2\": 0.4613848175485234,\n\
\ \"mc2_stderr\": 0.014633300293081979\n }\n}\n```"
repo_url: https://huggingface.co/DevaMalla/llama_7b_qlora_cds
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|arc:challenge|25_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hellaswag|10_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T04-45-53.038804.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T04-45-53.038804.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T04-45-53.038804.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T04-45-53.038804.parquet'
- config_name: results
data_files:
- split: 2023_09_22T04_45_53.038804
path:
- results_2023-09-22T04-45-53.038804.parquet
- split: latest
path:
- results_2023-09-22T04-45-53.038804.parquet
---
# Dataset Card for Evaluation run of DevaMalla/llama_7b_qlora_cds
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DevaMalla/llama_7b_qlora_cds
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DevaMalla/llama_7b_qlora_cds](https://huggingface.co/DevaMalla/llama_7b_qlora_cds) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
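Since every evaluated task is exposed as its own configuration, you can enumerate the available configurations and splits before loading anything; a minimal sketch using the standard `datasets` helpers (the repository name is the one from this card):
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds"

# List the 61 configurations (per-task details plus the aggregated "results").
configs = get_dataset_config_names(repo)
print(len(configs), configs[:3])

# Each configuration exposes one split per run timestamp plus a "latest" alias.
print(get_dataset_split_names(repo, "harness_truthfulqa_mc_0"))
```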
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds",
"harness_truthfulqa_mc_0",
	split="latest")
```
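The aggregated metrics can be loaded the same way; a short sketch reading the "results" configuration at its "latest" split (both names come from the YAML header of this card):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split always aliases the most recent timestamped split.
results = load_dataset("open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds",
	"results",
	split="latest")
print(results[0])
```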
## Latest results
These are the [latest results from run 2023-09-22T04:45:53.038804](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds/blob/main/results_2023-09-22T04-45-53.038804.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3311519760109131,
"acc_stderr": 0.03370272331161324,
"acc_norm": 0.3349306016471848,
"acc_norm_stderr": 0.033689358841766895,
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323013,
"mc2": 0.4613848175485234,
"mc2_stderr": 0.014633300293081979
},
"harness|arc:challenge|25": {
"acc": 0.49658703071672355,
"acc_stderr": 0.014611050403244081,
"acc_norm": 0.5247440273037542,
"acc_norm_stderr": 0.014593487694937738
},
"harness|hellaswag|10": {
"acc": 0.5828520215096594,
"acc_stderr": 0.00492080031323274,
"acc_norm": 0.7776339374626569,
"acc_norm_stderr": 0.004149859300604921
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.035834961763610625,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.035834961763610625
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.36981132075471695,
"acc_stderr": 0.029711421880107915,
"acc_norm": 0.36981132075471695,
"acc_norm_stderr": 0.029711421880107915
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2986111111111111,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.2986111111111111,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.03435568056047875,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.03435568056047875
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307811,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307811
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3574468085106383,
"acc_stderr": 0.03132941789476425,
"acc_norm": 0.3574468085106383,
"acc_norm_stderr": 0.03132941789476425
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481404,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481404
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.0360010569272777,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.0360010569272777
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02306818884826111,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02306818884826111
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3193548387096774,
"acc_stderr": 0.02652270967466777,
"acc_norm": 0.3193548387096774,
"acc_norm_stderr": 0.02652270967466777
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.03851716319398395,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.03851716319398395
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2878787878787879,
"acc_stderr": 0.03225883512300993,
"acc_norm": 0.2878787878787879,
"acc_norm_stderr": 0.03225883512300993
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37305699481865284,
"acc_stderr": 0.034902055920485744,
"acc_norm": 0.37305699481865284,
"acc_norm_stderr": 0.034902055920485744
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2717948717948718,
"acc_stderr": 0.02255655101013235,
"acc_norm": 0.2717948717948718,
"acc_norm_stderr": 0.02255655101013235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2815126050420168,
"acc_stderr": 0.02921354941437217,
"acc_norm": 0.2815126050420168,
"acc_norm_stderr": 0.02921354941437217
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.19205298013245034,
"acc_stderr": 0.03216298420593613,
"acc_norm": 0.19205298013245034,
"acc_norm_stderr": 0.03216298420593613
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3743119266055046,
"acc_stderr": 0.02074895940898833,
"acc_norm": 0.3743119266055046,
"acc_norm_stderr": 0.02074895940898833
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.026491914727355157,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.026491914727355157
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3382352941176471,
"acc_stderr": 0.03320574612945431,
"acc_norm": 0.3382352941176471,
"acc_norm_stderr": 0.03320574612945431
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3459915611814346,
"acc_stderr": 0.03096481058878671,
"acc_norm": 0.3459915611814346,
"acc_norm_stderr": 0.03096481058878671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4260089686098655,
"acc_stderr": 0.03318833286217281,
"acc_norm": 0.4260089686098655,
"acc_norm_stderr": 0.03318833286217281
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3053435114503817,
"acc_stderr": 0.04039314978724562,
"acc_norm": 0.3053435114503817,
"acc_norm_stderr": 0.04039314978724562
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5619834710743802,
"acc_stderr": 0.04529146804435792,
"acc_norm": 0.5619834710743802,
"acc_norm_stderr": 0.04529146804435792
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.04691521224077741,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.04691521224077741
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.36809815950920244,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.36809815950920244,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.32038834951456313,
"acc_stderr": 0.04620284082280039,
"acc_norm": 0.32038834951456313,
"acc_norm_stderr": 0.04620284082280039
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.43162393162393164,
"acc_stderr": 0.0324483553531149,
"acc_norm": 0.43162393162393164,
"acc_norm_stderr": 0.0324483553531149
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.39208173690932313,
"acc_stderr": 0.017458524050147636,
"acc_norm": 0.39208173690932313,
"acc_norm_stderr": 0.017458524050147636
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.36127167630057805,
"acc_stderr": 0.025862201852277895,
"acc_norm": 0.36127167630057805,
"acc_norm_stderr": 0.025862201852277895
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.34967320261437906,
"acc_stderr": 0.0273053080762747,
"acc_norm": 0.34967320261437906,
"acc_norm_stderr": 0.0273053080762747
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.0254942593506949,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.0254942593506949
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.36419753086419754,
"acc_stderr": 0.026774929899722327,
"acc_norm": 0.36419753086419754,
"acc_norm_stderr": 0.026774929899722327
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2907801418439716,
"acc_stderr": 0.027090664368353178,
"acc_norm": 0.2907801418439716,
"acc_norm_stderr": 0.027090664368353178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2848761408083442,
"acc_stderr": 0.011527830846369016,
"acc_norm": 0.2848761408083442,
"acc_norm_stderr": 0.011527830846369016
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.028418208619406794,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.028418208619406794
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3480392156862745,
"acc_stderr": 0.019270998708223974,
"acc_norm": 0.3480392156862745,
"acc_norm_stderr": 0.019270998708223974
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23673469387755103,
"acc_stderr": 0.02721283588407315,
"acc_norm": 0.23673469387755103,
"acc_norm_stderr": 0.02721283588407315
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03333333333333334,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03333333333333334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683228,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683228
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.47953216374269003,
"acc_stderr": 0.038316105328219316,
"acc_norm": 0.47953216374269003,
"acc_norm_stderr": 0.038316105328219316
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323013,
"mc2": 0.4613848175485234,
"mc2_stderr": 0.014633300293081979
}
}
```
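If you prefer the raw JSON linked above to the parquet splits, it can be fetched directly; a minimal sketch with `huggingface_hub` (the filename is taken from the link above; as an assumption, the on-disk file may nest the metrics under a top-level "results" key):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the "Latest results" link.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_cds",
    filename="results_2023-09-22T04-45-53.038804.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

metrics = data.get("results", data)  # assumption: metrics may be nested under "results"
print(metrics["all"]["acc"])  # expected: 0.3311519760109131
```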
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
TheVarunKaushik/Valorant_Advice | 2023-09-22T05:06:46.000Z | [
"license:openrail",
"region:us"
] | TheVarunKaushik | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_TinyPixel__elm-test | 2023-09-22T05:14:26.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TinyPixel/elm-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TinyPixel/elm-test](https://huggingface.co/TinyPixel/elm-test) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TinyPixel__elm-test\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T05:13:08.764414](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__elm-test/blob/main/results_2023-09-22T05-13-08.764414.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47299833231682314,\n\
\ \"acc_stderr\": 0.03534398633986979,\n \"acc_norm\": 0.4768955666399029,\n\
\ \"acc_norm_stderr\": 0.0353292655095149,\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476199,\n \"mc2\": 0.39505922754623324,\n\
\ \"mc2_stderr\": 0.01379379444493236\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.014611390804670088,\n \
\ \"acc_norm\": 0.5315699658703071,\n \"acc_norm_stderr\": 0.014582236460866977\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5914160525791675,\n\
\ \"acc_stderr\": 0.004905674408614026,\n \"acc_norm\": 0.7897829117705636,\n\
\ \"acc_norm_stderr\": 0.004066299761478493\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.46037735849056605,\n \"acc_stderr\": 0.030676096599389184,\n\
\ \"acc_norm\": 0.46037735849056605,\n \"acc_norm_stderr\": 0.030676096599389184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194974,\n \"\
acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194974\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.49032258064516127,\n\
\ \"acc_stderr\": 0.028438677998909558,\n \"acc_norm\": 0.49032258064516127,\n\
\ \"acc_norm_stderr\": 0.028438677998909558\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.033959703819985726,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.033959703819985726\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.48484848484848486,\n \"acc_stderr\": 0.0356071651653106,\n \"\
acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.0356071651653106\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.0330881859441575,\n\
\ \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.0330881859441575\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.025275892070240634,\n\
\ \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.025275892070240634\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.636697247706422,\n \"acc_stderr\": 0.020620603919625804,\n \"\
acc_norm\": 0.636697247706422,\n \"acc_norm_stderr\": 0.020620603919625804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.03054674526495318,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03054674526495318\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5441176470588235,\n \"acc_stderr\": 0.03495624522015476,\n \"\
acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.03495624522015476\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6286919831223629,\n \"acc_stderr\": 0.031450686007448596,\n \
\ \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.031450686007448596\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.043564472026650695,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.043564472026650695\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n\
\ \"acc_stderr\": 0.03023638994217309,\n \"acc_norm\": 0.6923076923076923,\n\
\ \"acc_norm_stderr\": 0.03023638994217309\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.648786717752235,\n\
\ \"acc_stderr\": 0.01706998205149943,\n \"acc_norm\": 0.648786717752235,\n\
\ \"acc_norm_stderr\": 0.01706998205149943\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5028901734104047,\n \"acc_stderr\": 0.026918645383239004,\n\
\ \"acc_norm\": 0.5028901734104047,\n \"acc_norm_stderr\": 0.026918645383239004\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.028629305194003543,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.028629305194003543\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.027917050748484627,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.027917050748484627\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583295,\n\
\ \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583295\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963764,\n \
\ \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963764\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35853976531942633,\n\
\ \"acc_stderr\": 0.012248487319682734,\n \"acc_norm\": 0.35853976531942633,\n\
\ \"acc_norm_stderr\": 0.012248487319682734\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213528,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213528\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.44607843137254904,\n \"acc_stderr\": 0.020109864547181354,\n \
\ \"acc_norm\": 0.44607843137254904,\n \"acc_norm_stderr\": 0.020109864547181354\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4816326530612245,\n \"acc_stderr\": 0.031987615467631264,\n\
\ \"acc_norm\": 0.4816326530612245,\n \"acc_norm_stderr\": 0.031987615467631264\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476199,\n \"mc2\": 0.39505922754623324,\n\
\ \"mc2_stderr\": 0.01379379444493236\n }\n}\n```"
repo_url: https://huggingface.co/TinyPixel/elm-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|arc:challenge|25_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hellaswag|10_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T05-13-08.764414.parquet'
- config_name: results
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- results_2023-09-22T05-13-08.764414.parquet
- split: latest
path:
- results_2023-09-22T05-13-08.764414.parquet
---
# Dataset Card for Evaluation run of TinyPixel/elm-test
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TinyPixel/elm-test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TinyPixel/elm-test](https://huggingface.co/TinyPixel/elm-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TinyPixel__elm-test",
"harness_truthfulqa_mc_0",
split="train")
```
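The aggregated metrics can be loaded the same way through the `results` configuration. A minimal sketch, assuming the config and split names from the listing above (the exact column layout of the results file may vary between harness versions):
```python
from datasets import load_dataset

# Aggregated results of the most recent evaluation run, per the "results" config above.
results = load_dataset("open-llm-leaderboard/details_TinyPixel__elm-test",
	"results",
	split="latest")
print(results[0])  # a single row carrying the aggregated metrics
```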
## Latest results
These are the [latest results from run 2023-09-22T05:13:08.764414](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__elm-test/blob/main/results_2023-09-22T05-13-08.764414.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.47299833231682314,
"acc_stderr": 0.03534398633986979,
"acc_norm": 0.4768955666399029,
"acc_norm_stderr": 0.0353292655095149,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476199,
"mc2": 0.39505922754623324,
"mc2_stderr": 0.01379379444493236
},
"harness|arc:challenge|25": {
"acc": 0.5,
"acc_stderr": 0.014611390804670088,
"acc_norm": 0.5315699658703071,
"acc_norm_stderr": 0.014582236460866977
},
"harness|hellaswag|10": {
"acc": 0.5914160525791675,
"acc_stderr": 0.004905674408614026,
"acc_norm": 0.7897829117705636,
"acc_norm_stderr": 0.004066299761478493
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.46037735849056605,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.46037735849056605,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194974,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194974
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.028438677998909558,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.028438677998909558
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.033959703819985726,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.033959703819985726
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.03781887353205982,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.03781887353205982
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.0356071651653106,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.0356071651653106
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.0330881859441575,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.0330881859441575
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.025275892070240634,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.025275892070240634
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.636697247706422,
"acc_stderr": 0.020620603919625804,
"acc_norm": 0.636697247706422,
"acc_norm_stderr": 0.020620603919625804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03054674526495318,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03054674526495318
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.03495624522015476,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.03495624522015476
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.031450686007448596,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.031450686007448596
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.043564472026650695,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.043564472026650695
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.03023638994217309,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.03023638994217309
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.648786717752235,
"acc_stderr": 0.01706998205149943,
"acc_norm": 0.648786717752235,
"acc_norm_stderr": 0.01706998205149943
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.026918645383239004,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.026918645383239004
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.028629305194003543,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.028629305194003543
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484627,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484627
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5061728395061729,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.5061728395061729,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.028473501272963764,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.028473501272963764
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35853976531942633,
"acc_stderr": 0.012248487319682734,
"acc_norm": 0.35853976531942633,
"acc_norm_stderr": 0.012248487319682734
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213528,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213528
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.44607843137254904,
"acc_stderr": 0.020109864547181354,
"acc_norm": 0.44607843137254904,
"acc_norm_stderr": 0.020109864547181354
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4816326530612245,
"acc_stderr": 0.031987615467631264,
"acc_norm": 0.4816326530612245,
"acc_norm_stderr": 0.031987615467631264
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476199,
"mc2": 0.39505922754623324,
"mc2_stderr": 0.01379379444493236
}
}
```
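If you work from the raw results JSON above instead, a few lines of standard-library Python are enough to rank the MMLU subtasks by accuracy. This is only an illustrative sketch; `results.json` stands in for a hypothetical local copy of the file linked above:
```python
import json

# Hypothetical local copy of the results file linked above.
with open("results.json") as f:
    metrics = json.load(f)

# Keep the per-subject MMLU ("hendrycksTest") entries and rank them by accuracy.
mmlu = {task: vals["acc"]
        for task, vals in metrics.items()
        if task.startswith("harness|hendrycksTest-")}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task:55s} {acc:.3f}")
```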
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_glaiveai__glaive-coder-7b | 2023-09-22T05:34:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of glaiveai/glaive-coder-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [glaiveai/glaive-coder-7b](https://huggingface.co/glaiveai/glaive-coder-7b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_glaiveai__glaive-coder-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T05:33:12.124557](https://huggingface.co/datasets/open-llm-leaderboard/details_glaiveai__glaive-coder-7b/blob/main/results_2023-09-22T05-33-12.124557.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each task's results in its own configuration and in the \"latest\" split of\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.37360254434465706,\n\
\ \"acc_stderr\": 0.034812375630621666,\n \"acc_norm\": 0.37712834453364064,\n\
\ \"acc_norm_stderr\": 0.03481357920684897,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520674,\n \"mc2\": 0.39881009957099056,\n\
\ \"mc2_stderr\": 0.01553461726038253\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3771331058020478,\n \"acc_stderr\": 0.014163366896192589,\n\
\ \"acc_norm\": 0.42662116040955633,\n \"acc_norm_stderr\": 0.014453185592920293\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.48834893447520417,\n\
\ \"acc_stderr\": 0.004988426528513012,\n \"acc_norm\": 0.6468830910177256,\n\
\ \"acc_norm_stderr\": 0.004769618829196517\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.04256193767901407,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.04256193767901407\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3618421052631579,\n \"acc_stderr\": 0.03910525752849724,\n\
\ \"acc_norm\": 0.3618421052631579,\n \"acc_norm_stderr\": 0.03910525752849724\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3660377358490566,\n \"acc_stderr\": 0.02964781353936525,\n\
\ \"acc_norm\": 0.3660377358490566,\n \"acc_norm_stderr\": 0.02964781353936525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3541666666666667,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.3541666666666667,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n\
\ \"acc_stderr\": 0.034140140070440354,\n \"acc_norm\": 0.2774566473988439,\n\
\ \"acc_norm_stderr\": 0.034140140070440354\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.34893617021276596,\n \"acc_stderr\": 0.03115852213135778,\n\
\ \"acc_norm\": 0.34893617021276596,\n \"acc_norm_stderr\": 0.03115852213135778\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3931034482758621,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.3931034482758621,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.02369541500946309,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.02369541500946309\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2967741935483871,\n \"acc_stderr\": 0.025988500792411905,\n \"\
acc_norm\": 0.2967741935483871,\n \"acc_norm_stderr\": 0.025988500792411905\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n \"\
acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.41818181818181815,\n \"acc_stderr\": 0.03851716319398395,\n\
\ \"acc_norm\": 0.41818181818181815,\n \"acc_norm_stderr\": 0.03851716319398395\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.45454545454545453,\n \"acc_stderr\": 0.03547601494006936,\n \"\
acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.03547601494006936\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.38341968911917096,\n \"acc_stderr\": 0.03508984236295341,\n\
\ \"acc_norm\": 0.38341968911917096,\n \"acc_norm_stderr\": 0.03508984236295341\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2717948717948718,\n \"acc_stderr\": 0.022556551010132354,\n\
\ \"acc_norm\": 0.2717948717948718,\n \"acc_norm_stderr\": 0.022556551010132354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275798,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275798\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.42201834862385323,\n \"acc_stderr\": 0.021174991407763178,\n \"\
acc_norm\": 0.42201834862385323,\n \"acc_norm_stderr\": 0.021174991407763178\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298825,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298825\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4264705882352941,\n \"acc_stderr\": 0.034711579079534254,\n \"\
acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.034711579079534254\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4978902953586498,\n \"acc_stderr\": 0.032546938018020076,\n \
\ \"acc_norm\": 0.4978902953586498,\n \"acc_norm_stderr\": 0.032546938018020076\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4439461883408072,\n\
\ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.4439461883408072,\n\
\ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.40458015267175573,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.40458015267175573,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4793388429752066,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.4793388429752066,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n\
\ \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.4351851851851852,\n\
\ \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.37423312883435583,\n \"acc_stderr\": 0.03802068102899614,\n\
\ \"acc_norm\": 0.37423312883435583,\n \"acc_norm_stderr\": 0.03802068102899614\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4368932038834951,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.4368932038834951,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5726495726495726,\n\
\ \"acc_stderr\": 0.03240847393516327,\n \"acc_norm\": 0.5726495726495726,\n\
\ \"acc_norm_stderr\": 0.03240847393516327\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.43167305236270753,\n\
\ \"acc_stderr\": 0.01771222893929979,\n \"acc_norm\": 0.43167305236270753,\n\
\ \"acc_norm_stderr\": 0.01771222893929979\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4277456647398844,\n \"acc_stderr\": 0.026636539741116082,\n\
\ \"acc_norm\": 0.4277456647398844,\n \"acc_norm_stderr\": 0.026636539741116082\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.014465893829859924,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.014465893829859924\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3758169934640523,\n \"acc_stderr\": 0.02773283435336395,\n\
\ \"acc_norm\": 0.3758169934640523,\n \"acc_norm_stderr\": 0.02773283435336395\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4115755627009646,\n\
\ \"acc_stderr\": 0.027950481494401266,\n \"acc_norm\": 0.4115755627009646,\n\
\ \"acc_norm_stderr\": 0.027950481494401266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3765432098765432,\n \"acc_stderr\": 0.02695934451874779,\n\
\ \"acc_norm\": 0.3765432098765432,\n \"acc_norm_stderr\": 0.02695934451874779\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.30851063829787234,\n \"acc_stderr\": 0.027553366165101362,\n \
\ \"acc_norm\": 0.30851063829787234,\n \"acc_norm_stderr\": 0.027553366165101362\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2770534550195567,\n\
\ \"acc_stderr\": 0.011430462443719681,\n \"acc_norm\": 0.2770534550195567,\n\
\ \"acc_norm_stderr\": 0.011430462443719681\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1948529411764706,\n \"acc_stderr\": 0.024060599423487414,\n\
\ \"acc_norm\": 0.1948529411764706,\n \"acc_norm_stderr\": 0.024060599423487414\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.36764705882352944,\n \"acc_stderr\": 0.019506291693954847,\n \
\ \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.019506291693954847\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.39090909090909093,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.39090909090909093,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4163265306122449,\n \"acc_stderr\": 0.031557828165561644,\n\
\ \"acc_norm\": 0.4163265306122449,\n \"acc_norm_stderr\": 0.031557828165561644\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.38308457711442784,\n\
\ \"acc_stderr\": 0.034375193373382504,\n \"acc_norm\": 0.38308457711442784,\n\
\ \"acc_norm_stderr\": 0.034375193373382504\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5146198830409356,\n \"acc_stderr\": 0.038331852752130254,\n\
\ \"acc_norm\": 0.5146198830409356,\n \"acc_norm_stderr\": 0.038331852752130254\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520674,\n \"mc2\": 0.39881009957099056,\n\
\ \"mc2_stderr\": 0.01553461726038253\n }\n}\n```"
repo_url: https://huggingface.co/glaiveai/glaive-coder-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|arc:challenge|25_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hellaswag|10_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-33-12.124557.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-33-12.124557.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T05-33-12.124557.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T05-33-12.124557.parquet'
- config_name: results
data_files:
- split: 2023_09_22T05_33_12.124557
path:
- results_2023-09-22T05-33-12.124557.parquet
- split: latest
path:
- results_2023-09-22T05-33-12.124557.parquet
---
# Dataset Card for Evaluation run of glaiveai/glaive-coder-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/glaiveai/glaive-coder-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [glaiveai/glaive-coder-7b](https://huggingface.co/glaiveai/glaive-coder-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_glaiveai__glaive-coder-7b",
"harness_truthfulqa_mc_0",
	split="latest")
```
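The aggregated "results" configuration mentioned above can be loaded the same way; a minimal sketch, assuming the config and split names declared in the YAML header of this card:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; "results" and "latest"
# are the config/split names listed in this card's YAML header.
results = load_dataset("open-llm-leaderboard/details_glaiveai__glaive-coder-7b",
                       "results",
                       split="latest")
print(results)
```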
## Latest results
These are the [latest results from run 2023-09-22T05:33:12.124557](https://huggingface.co/datasets/open-llm-leaderboard/details_glaiveai__glaive-coder-7b/blob/main/results_2023-09-22T05-33-12.124557.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.37360254434465706,
"acc_stderr": 0.034812375630621666,
"acc_norm": 0.37712834453364064,
"acc_norm_stderr": 0.03481357920684897,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520674,
"mc2": 0.39881009957099056,
"mc2_stderr": 0.01553461726038253
},
"harness|arc:challenge|25": {
"acc": 0.3771331058020478,
"acc_stderr": 0.014163366896192589,
"acc_norm": 0.42662116040955633,
"acc_norm_stderr": 0.014453185592920293
},
"harness|hellaswag|10": {
"acc": 0.48834893447520417,
"acc_stderr": 0.004988426528513012,
"acc_norm": 0.6468830910177256,
"acc_norm_stderr": 0.004769618829196517
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901407,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901407
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3618421052631579,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.3618421052631579,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3660377358490566,
"acc_stderr": 0.02964781353936525,
"acc_norm": 0.3660377358490566,
"acc_norm_stderr": 0.02964781353936525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3541666666666667,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.3541666666666667,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.034140140070440354,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.034140140070440354
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.34893617021276596,
"acc_stderr": 0.03115852213135778,
"acc_norm": 0.34893617021276596,
"acc_norm_stderr": 0.03115852213135778
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3931034482758621,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.3931034482758621,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.02369541500946309,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.02369541500946309
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2967741935483871,
"acc_stderr": 0.025988500792411905,
"acc_norm": 0.2967741935483871,
"acc_norm_stderr": 0.025988500792411905
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.03851716319398395,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.03851716319398395
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.03547601494006936,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.03547601494006936
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.38341968911917096,
"acc_stderr": 0.03508984236295341,
"acc_norm": 0.38341968911917096,
"acc_norm_stderr": 0.03508984236295341
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2717948717948718,
"acc_stderr": 0.022556551010132354,
"acc_norm": 0.2717948717948718,
"acc_norm_stderr": 0.022556551010132354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275798,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275798
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3067226890756303,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.3067226890756303,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.42201834862385323,
"acc_stderr": 0.021174991407763178,
"acc_norm": 0.42201834862385323,
"acc_norm_stderr": 0.021174991407763178
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.031280390843298825,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.031280390843298825
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.034711579079534254,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.034711579079534254
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4978902953586498,
"acc_stderr": 0.032546938018020076,
"acc_norm": 0.4978902953586498,
"acc_norm_stderr": 0.032546938018020076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4439461883408072,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.4439461883408072,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.40458015267175573,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.40458015267175573,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4793388429752066,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.4793388429752066,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.04792898170907062,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.04792898170907062
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.37423312883435583,
"acc_stderr": 0.03802068102899614,
"acc_norm": 0.37423312883435583,
"acc_norm_stderr": 0.03802068102899614
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.4368932038834951,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.4368932038834951,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5726495726495726,
"acc_stderr": 0.03240847393516327,
"acc_norm": 0.5726495726495726,
"acc_norm_stderr": 0.03240847393516327
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.43167305236270753,
"acc_stderr": 0.01771222893929979,
"acc_norm": 0.43167305236270753,
"acc_norm_stderr": 0.01771222893929979
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.026636539741116082,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.026636539741116082
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859924,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859924
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3758169934640523,
"acc_stderr": 0.02773283435336395,
"acc_norm": 0.3758169934640523,
"acc_norm_stderr": 0.02773283435336395
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4115755627009646,
"acc_stderr": 0.027950481494401266,
"acc_norm": 0.4115755627009646,
"acc_norm_stderr": 0.027950481494401266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3765432098765432,
"acc_stderr": 0.02695934451874779,
"acc_norm": 0.3765432098765432,
"acc_norm_stderr": 0.02695934451874779
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30851063829787234,
"acc_stderr": 0.027553366165101362,
"acc_norm": 0.30851063829787234,
"acc_norm_stderr": 0.027553366165101362
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2770534550195567,
"acc_stderr": 0.011430462443719681,
"acc_norm": 0.2770534550195567,
"acc_norm_stderr": 0.011430462443719681
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1948529411764706,
"acc_stderr": 0.024060599423487414,
"acc_norm": 0.1948529411764706,
"acc_norm_stderr": 0.024060599423487414
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.019506291693954847,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.019506291693954847
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.39090909090909093,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.39090909090909093,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4163265306122449,
"acc_stderr": 0.031557828165561644,
"acc_norm": 0.4163265306122449,
"acc_norm_stderr": 0.031557828165561644
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.38308457711442784,
"acc_stderr": 0.034375193373382504,
"acc_norm": 0.38308457711442784,
"acc_norm_stderr": 0.034375193373382504
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5146198830409356,
"acc_stderr": 0.038331852752130254,
"acc_norm": 0.5146198830409356,
"acc_norm_stderr": 0.038331852752130254
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520674,
"mc2": 0.39881009957099056,
"mc2_stderr": 0.01553461726038253
}
}
```
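For illustration only (not part of the automated card), the per-task accuracies above can be aggregated into an MMLU-style average; a minimal sketch, assuming the JSON block has been saved locally under a placeholder name:
```python
import json

# "results.json" is a placeholder path for the JSON block shown above,
# not a file shipped with this dataset.
with open("results.json") as f:
    results = json.load(f)

# Average the 5-shot accuracy over all hendrycksTest (MMLU) tasks.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu_accs)} MMLU tasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```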
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mboth/warmeVersorgen-50-undersampled | 2023-09-22T05:42:29.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Beziehen
'1': Erzeugen
'2': Speichern
'3': Verteilen
splits:
- name: train
num_bytes: 39397.008169573855
num_examples: 200
- name: test
num_bytes: 447086
num_examples: 2265
- name: valid
num_bytes: 447086
num_examples: 2265
download_size: 342904
dataset_size: 933569.0081695738
---
# Dataset Card for "warmeVersorgen-50-undersampled"
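A minimal usage sketch, assuming the standard `datasets` API and the feature/label names declared in the YAML header above:
```python
from datasets import load_dataset

# Load the undersampled train split and inspect the class labels.
ds = load_dataset("mboth/warmeVersorgen-50-undersampled", split="train")
print(ds.features["label"].names)  # ['Beziehen', 'Erzeugen', 'Speichern', 'Verteilen']
print(ds[0]["text"])
```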
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/waermeVersorgen-100-undersampled | 2023-09-22T05:42:33.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Beziehen
'1': Erzeugen
'2': Speichern
'3': Verteilen
splits:
- name: train
num_bytes: 78794.01633914771
num_examples: 400
- name: test
num_bytes: 447086
num_examples: 2265
- name: valid
num_bytes: 447086
num_examples: 2265
download_size: 355050
dataset_size: 972966.0163391477
---
# Dataset Card for "waermeVersorgen-100-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/waermeVersorgen-200-undersampled | 2023-09-22T05:42:37.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Beziehen
'1': Erzeugen
'2': Speichern
'3': Verteilen
splits:
- name: train
num_bytes: 144390.03494148818
num_examples: 733
- name: test
num_bytes: 447086
num_examples: 2265
- name: valid
num_bytes: 447086
num_examples: 2265
download_size: 374039
dataset_size: 1038562.0349414882
---
# Dataset Card for "waermeVersorgen-200-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marthazh/sanhuanzhonghuinew | 2023-09-22T05:54:21.000Z | [
"region:us"
] | marthazh | null | null | null | 0 | 0 | Entry not found |
mboth/luftVersorgen-50-undersampled | 2023-09-22T05:58:58.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': LuftBereitstellen
'1': LuftVerteilen
splits:
- name: train
num_bytes: 19757.430602572782
num_examples: 100
- name: test
num_bytes: 290707
num_examples: 1477
- name: valid
num_bytes: 290707
num_examples: 1477
download_size: 227539
dataset_size: 601171.4306025729
---
# Dataset Card for "luftVersorgen-50-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |